The FBI warned that criminals are increasingly using AI voice cloning, deepfakes and altered images to carry out virtual kidnapping extortion schemes that pressure victims into paying quickly. Authorities advise verifying a loved one's safety directly, using family safe words, and reporting incidents to local law enforcement and the IC3 portal.

The FBI issued a nationwide advisory describing a surge in AI-assisted virtual kidnapping scams in which attackers use voice cloning, deepfake video and AI-altered photos to fabricate convincing evidence of a loved one being held for ransom. Scammers typically contact victims and demand immediate payment, often in cryptocurrency or through other hard-to-trace methods, while using synthetic audio or video clips to simulate panic or threats.

The bureau noted that current AI-generated materials often contain detectable inconsistencies and advised practical defenses: contact the purported victim directly using known phone numbers, establish and use family safe words, limit the sharing of private media online, and report incidents to local law enforcement and the FBI's Internet Crime Complaint Center (IC3). The advisory highlights the speed and emotional pressure these scams exploit, urging institutions and consumers to adopt verification protocols and service providers to monitor for abusive accounts and manipulated media. The FBI emphasized coordination with victims and public awareness as key to reducing successful extortion attempts and improving investigative leads.