ChatGPT Images 2.0 Is Becoming a Market Fraud Tool with Deepfakes
BeInCrypto2026-05-09 18:15:14


Deepfakes have shifted from a niche concern to a mass-market threat. May’s incidents show how consumer-grade tools now outpace any institutional response. The damage extends into crypto, where scammers use artificial intelligence (AI) to run impersonation scams.

The Deepfake Economy Is Here, and Detection Is Losing

In early May 2026, AI-generated content surfaced across politics, entertainment, and crime, as documented by Resemble AI. FBI Director Kash Patel posted a video that appeared to use AI to generate shots nearly identical to those in the Beastie Boys’ “Sabotage” music video. Meanwhile, an AI video of mayoral candidate Spencer Pratt drew 4.1 million views on X.

These tools aren’t just fueling viral content; they are causing real financial harm. A Chicago man lost $69,000 to a scammer who flashed an AI-generated US Marshals badge on a video call. The Atlantic’s Lila Shroff found that OpenAI’s ChatGPT Images 2.0 can generate fake IDs, prescriptions, receipts, bank alerts, and news screenshots. “All of this makes it even harder for banks, hospitals, government agencies, and the like to prevent fraud,” Shroff wrote.

404 Media exposed Haotian AI, a Chinese real-time deepfake tool. Reporter Joseph Cox swapped faces on a live Teams call using the software, proving the technology is functional, for sale, and already being deployed against real victims.

“Three of this week’s stories, Haotian AI, the Meloni deepfake, and the Patel FBI video, come from completely different categories and geographies, but they share a structural condition: the tools used to produce the harm are consumer-grade, widely available, and improving faster than any institutional response. Haotian AI costs a few hundred dollars and works on Teams. ChatGPT Images 2.0 is a subscription product,” Resemble AI said.

Crypto Also Bears the Cost

Crypto has become a prime target for AI-driven deception.
According to Chainalysis, fraudsters now pair deepfakes, face-swap apps, and large language models with classic romance and investment cons, and the math favors them: the average AI-assisted crypto scam nets roughly $3.2 million, about 4.5 times the haul of a conventional scheme.

Several cases underline the threat. In August 2025, attackers stole $2 million by impersonating the founder of Plasma. BeInCrypto has also reported on North Korean operatives running deepfake video calls on Zoom. Together, these incidents mark AI-powered impersonation as one of the sector’s most pressing security risks.