Scammers are using AI voice‑cloning to produce urgent voicemails that impersonate relatives and demand immediate money transfers. The Guardian urges people to pause, verify through known contact methods, and adopt family codewords to avoid falling victim to these vishing attacks.

The Guardian reports a sharp rise in AI‑enabled voice‑cloning used to craft convincing voicemail appeals for emergency funds, often impersonating children or other relatives. Criminals can assemble a usable voice clone from only a few seconds of audio harvested from voicemail greetings, social media posts, or videos, then deliver persuasive messages that pressure victims into instant money transfers. The article outlines practical precautions: remove or replace personalized voicemail greetings with generic messages, establish family-specific codewords and verification routines, and always call back on a saved number rather than replying to the message. It stresses that attackers pair social engineering with AI tools to manufacture trust and urgency, producing vishing campaigns that are harder to detect. The piece also highlights the emotional toll on affected families and the need for technology firms and law enforcement to adapt their detection and response measures. Simple behavioral defenses and verification habits are presented as the most effective immediate protection for potential victims.