Romance Scammers Now Using Real-Time AI Deepfake Video Calls
Deepfake technology enables scammers to conduct convincing live video chats, defeating traditional verification.
The Federal Trade Commission has issued an alarming report revealing that romance scammers have begun using real-time deepfake video technology to conduct convincing video calls with victims. This dangerous new technique defeats the common advice to "video chat to verify someone's identity." Victims report having multiple video conversations over weeks or months with their supposed romantic interest, only to eventually discover the person never existed. The AI technology can generate realistic facial expressions, lip-sync to the scammer's voice, and even mimic different accents.

The FTC's recommendations include:
- Reverse image search all photos.
- Be extremely wary of anyone claiming to be overseas military, an oil rig worker, or a doctor abroad.
- Never send money to someone you haven't met multiple times in person.

Romance scam losses hit a record $1.3 billion in 2024.