McAfee research: Scammers use AI voice‑cloning to impersonate relatives and executives in vishing scams
McAfee research published in mid‑November 2025 found that scammers increasingly leverage AI voice‑cloning tools to produce short, convincing audio impersonations of relatives or executives for phone and voicemail fraud, often pressuring victims into sending untraceable payments. The report warns that three to ten seconds of audio can suffice to create an effective clone, and that reported incidents frequently involve requests for gift cards, cryptocurrency, or wire transfers.
Industry research published by McAfee in mid‑November 2025 documents a marked rise in the use of AI voice‑cloning technologies by fraudsters conducting vishing and voicemail scams. The analysis shows that modern cloning tools can generate convincing short audio clips from as little as three to ten seconds of source speech, enabling scammers to impersonate family members, company executives, or trusted contacts and to pressure victims into fast, irreversible payments. Typical fraudulent asks mirror longstanding trends: requests for gift cards, cryptocurrency transfers, or immediate bank wires, because those channels are hard to trace or reverse. McAfee’s report combines survey data, incident investigations, and technical testing to explain how readily available AI toolchains lower the bar for producing realistic voice fakes and how scammers integrate cloned audio into multi‑channel campaigns that include spoofed caller ID and phishing messages. The research recommends layered defenses: authentication policies that do not rely on voice alone, explicit organizational protocols for payment requests, public awareness campaigns, and faster reporting to law‑enforcement and critical‑infrastructure partners to disrupt criminals exploiting AI capabilities.
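The report's core recommendation, that a voice alone should never authorize a payment, can be made concrete in internal tooling. The sketch below is a minimal, illustrative example of an out‑of‑band verification policy for payment requests; it is not drawn from McAfee’s report, and the directory, thresholds, channel names, and field names are assumptions for illustration only.

```python
from dataclasses import dataclass

# Hypothetical internal directory of known-good callback numbers. In practice this
# would come from an HR or payroll system, never from the caller or the voicemail.
KNOWN_CONTACTS = {
    "ceo@example.com": "+1-555-0100",
    "cfo@example.com": "+1-555-0101",
}

HIGH_RISK_CHANNELS = {"gift_card", "crypto"}   # assumed blocklist for payouts
SECOND_APPROVAL_THRESHOLD_USD = 5_000          # assumed threshold, for illustration

@dataclass
class PaymentRequest:
    requester: str               # identity claimed on the call or voicemail
    amount_usd: float
    channel: str                 # e.g. "wire", "gift_card", "crypto"
    callback_confirmed: bool     # True only after calling back a directory number
    second_approver: str | None  # independent approver for large requests

def verify_payment_request(req: PaymentRequest) -> tuple[bool, str]:
    """Return (approved, reason). The caller's voice is never part of the check."""
    if req.requester not in KNOWN_CONTACTS:
        return False, "requester not in internal directory"
    if req.channel in HIGH_RISK_CHANNELS:
        return False, f"channel '{req.channel}' is blocked for payments"
    if not req.callback_confirmed:
        return False, "call back the directory number before acting"
    if req.amount_usd >= SECOND_APPROVAL_THRESHOLD_USD and not req.second_approver:
        return False, "second approver required above threshold"
    return True, "verified via out-of-band callback and approval policy"

if __name__ == "__main__":
    # An "urgent" wire request received by voicemail fails until someone calls back
    # the number on file, regardless of how convincing the recorded voice sounds.
    urgent = PaymentRequest("ceo@example.com", 12_000, "wire",
                            callback_confirmed=False, second_approver=None)
    print(verify_payment_request(urgent))
```

The design point is simply that every branch keys off information the fraudster does not control (the internal directory, the callback, the second approver), which is the layered, non‑voice authentication the research advocates.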