Crime in the crypto space has been on the rise as advances in AI give scammers new tools, including deepfakes, to trick victims.
Elliptic’s most recent report finds that crypto fraudsters are employing advanced artificial intelligence (AI), ushering in a new era of cyber threats that spans state-sponsored attacks, deepfake scams, and other illicit activity.
The report highlights a striking advertisement for an “unethical” generative pre-trained transformer (GPT) discovered on the dark web. The ad asserts that “AI has two faces, just like humans,” and the report points to the same duality in the WormGPT advertisement, which reads:
“Embrace the dark symphony of code, where rules cease to exist, and the only limit is your imagination. Together, we navigate the shadows of cyberspace, ready to conquer new frontiers. What’s your next move?”
Elliptic has found fraudulent investment schemes being promoted through deepfake videos of Elon Musk and former Singaporean Prime Minister Lee Hsien Loong. The report also underscores the growing use of deepfake technology by fraudsters on social media platforms to deceive victims into handing over their funds. The report states:
“Crypto giveaway and doubling scams are increasingly using deepfake videos of crypto CEOs and celebrities to encourage victims to send funds to scam crypto addresses.”
Scams by North Korean State Actors Employing Artificial Intelligence
The Lazarus Group, a North Korean state-sponsored organization, has been linked to numerous cryptocurrency schemes and has stolen billions of dollars from investors. Anne Neuberger, the US Deputy National Security Advisor for Cyber and Emerging Technologies, recently expressed concern about North Korean state actors’ increasing illicit use of AI.
“The company has observed that certain North Korean and other nation-state and criminal actors are attempting to use AI models to expedite the creation of malicious software and identify vulnerable systems.”
The arrest of a dark web market proprietor in New York on May 18 illustrates the growing risk of legal consequences accompanying the surge in dark web activity. Following an FBI investigation that traced his cryptocurrency transactions, the 23-year-old man now faces charges for operating and profiting from a $100 million dark web narcotics marketplace.
Elliptic has also identified discussions on numerous dark web cybercrime forums about using large language models (LLMs) to reverse-engineer crypto wallet seed phrases, bypass authentication for services such as OnlyFans, and offer alternatives to image “undressing” manipulation services like DeepNude.