FBI warns of AI‑generated virtual kidnapping extortion using deepfakes and voice cloning
The FBI issued a public warning about extortionists using AI‑altered photos, deepfake video and, in some cases, voice cloning to create fake ‘proof of life’ for virtual kidnappings. The agency advised verifying directly with the alleged victim and using prearranged family code words to defeat these urgent ransom demands, which are often delivered via ephemeral messages.
In warnings publicized Dec. 5–6, 2025, the FBI alerted the public to a rise in virtual kidnapping scams that leverage AI‑generated images, video deepfakes, and voice cloning to fabricate evidence of abduction and pressure victims’ families into paying ransoms. Attackers typically send urgent text or social media messages, often with time‑limited media, demanding payment and threatening immediate harm.

The bureau emphasized common tactics including the alteration of authentic photos, the creation of synthetic video clips, and the use of cloned voices to impersonate victims; criminals exploit panic, limited verification time, and families’ emotional responses. The advisory also noted that attackers may use ephemeral messaging or burner accounts to evade tracing.

To counter the threat, the FBI recommended direct voice or video contact with the alleged victim, use of prearranged family code words, and skepticism of time‑sensitive communications that discourage independent confirmation. The bureau’s guidance stresses reporting incidents promptly to local law enforcement and preserving all messages and media for investigators as part of broader efforts to identify and disrupt deepfake‑enabled extortion operations.
Related Articles
Hiya Report: 1 in 4 Americans Received AI Deepfake Voice Calls, Scammers Outpacing Carriers
Study finds deepfake-enabled fraud occurring on an 'industrial scale', AI Incident Database