Cybersecurity firm Cato reports the new AI-powered tool primarily targets cryptocurrency exchanges and banks, marking “a new level of sophistication” in fraud.
Cybersecurity company Cato Networks said that a new AI-powered deepfake tool called ProKYC, which enables threat actors to bypass strict Know Your Customer (KYC) controls on cryptocurrency exchanges, shows a “new level of sophistication” in crypto fraud.
Etay Maor, chief security strategist at Cato Networks, stated in a report published on October 9 that the new AI tool is a significant improvement over the older techniques fraudsters used to bypass KYC checks and two-factor authentication.
Rather than buying forged ID documents on the dark web, fraudsters can now create entirely new identities using AI-powered tools.
According to Cato, the new artificial intelligence tool was created for use against cryptocurrency exchanges and financial institutions whose KYC procedures include comparing a new user’s face from a webcam photo to a government-issued identification document, such as a driver’s license or passport.
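The face-comparison step described above is typically automated: the verification system extracts a numerical embedding from the photo on the ID document and another from the webcam selfie, then approves the check only if the two are similar enough. The sketch below illustrates that idea with cosine similarity over toy vectors; the embedding values, function names, and the 0.8 threshold are illustrative assumptions, not details from any real exchange's KYC pipeline.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def kyc_face_match(id_embedding, selfie_embedding, threshold=0.8):
    """Approve the check only if the two embeddings are similar enough.

    In a real system the embeddings would come from a face-recognition
    model; here they are hand-written toy vectors.
    """
    return cosine_similarity(id_embedding, selfie_embedding) >= threshold

# Toy vectors standing in for real model outputs.
same_person = kyc_face_match([1.0, 0.2, 0.1], [0.9, 0.25, 0.12])       # similar -> True
different_person = kyc_face_match([1.0, 0.2, 0.1], [-0.3, 0.9, 0.5])   # dissimilar -> False
```

A tool like ProKYC attacks exactly this comparison: if the deepfaked webcam feed is generated from the same synthetic face that appears on the forged document, the two embeddings match by construction.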
A ProKYC demo video showed how the tool can create deepfake videos and forged ID documents to pass the facial recognition checks used by one of the world’s largest cryptocurrency exchanges.
In the video, the user generates an AI-created face and inserts it into a template of an Australian passport.
The ProKYC tool then animates the AI-generated face into a matching deepfake video and image, which are used to successfully bypass the KYC checks on the Dubai-based cryptocurrency exchange Bybit.
According to Cato, AI-powered tools like ProKYC make threat actors significantly more capable of committing new account fraud (NAF) on cryptocurrency exchanges.
A yearly subscription on the ProKYC website costs $629 and includes a bundle comprising a camera, a virtual emulator, face animation, fingerprints, and verification photo generation. The site also claims it can bypass KYC checks for payment services such as Revolut and Stripe, not just cryptocurrency exchanges.
According to Maor, effectively detecting and preventing this new type of AI fraud is difficult: overly stringent measures produce false positives, while lax controls let fraudsters through.
“Creating biometric authentication systems that are super restrictive can result in many false-positive alerts. On the other hand, lax controls can result in fraud.”
Nevertheless, detection techniques exist, some relying on human intervention, to spot anomalously high-quality photos and videos, irregular facial movements, and image quality inconsistencies.
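One of the signals mentioned above, anomalously high image quality, can be approximated programmatically: real webcam frames carry sensor noise, so a frame that is implausibly "clean" is worth flagging for human review. The snippet below is a minimal sketch of that heuristic on a grayscale pixel grid; the `min_noise` threshold and the adjacent-pixel-difference metric are illustrative assumptions, not a method attributed to Cato.

```python
import statistics

def noise_score(pixels):
    """Mean absolute difference between horizontally adjacent pixels --
    a crude proxy for sensor noise in a grayscale image (list of rows)."""
    diffs = [abs(row[i] - row[i + 1])
             for row in pixels
             for i in range(len(row) - 1)]
    return statistics.mean(diffs)

def looks_too_clean(pixels, min_noise=1.0):
    """Flag frames whose noise level is implausibly low for a real webcam."""
    return noise_score(pixels) < min_noise

# A frame with camera-like noise vs. a suspiciously uniform one.
noisy_frame = [[10, 13, 9, 14], [12, 8, 15, 11]]
flat_frame = [[10, 10, 10, 10], [10, 10, 10, 10]]
```

In practice such a check would be one weak signal among many, combined with liveness challenges and the facial-movement analysis the report describes, precisely because a single heuristic is easy for fraudsters to game.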
Depending on the type and severity of the offense, identity fraud carries harsh consequences in the US, with maximum punishments including heavy fines and up to 15 years in prison.
Software giant Gen Digital, which owns the antivirus companies Norton, Avast, and Avira, revealed in September that over the past ten months there has been a noticeable increase in cryptocurrency scammers using deepfake AI videos to lure people into fraudulent token schemes.