Reporters and security experts warn that AI voice‑cloning is enabling highly convincing emergency voicemails that mimic children or relatives to pressure family members into sending money quickly. The piece advises pausing, calling the person back on a known number, or using prearranged code words, and it cautions against paying via gift cards, wire transfers, or crypto.

Journalists and cybersecurity specialists detail a rapid rise in so‑called “hey‑kid” and grandparent scams enabled by AI voice‑cloning, in which fraudsters generate believable imitations from only a few seconds of audio. Attackers deploy these synthetic snippets in voicemails and live calls to build emotionally urgent narratives, such as a child stranded, a relative in hospital, or an account frozen, designed to extract money before victims can verify. The technology reduces the time and skill once needed to impersonate a loved one, and cases are being reported across the UK and US, prompting local news alerts and industry warnings. The story stresses simple verification steps: pause and call back on a number you know is genuine, use prearranged code words or phrases, and never comply with immediate payment demands that insist on gift cards, wire transfers, or cryptocurrency. It also highlights how criminal marketplaces and social platforms accelerate the spread of voice‑cloning tools and audio samples, and it urges consumer education and rapid reporting to authorities to limit financial and emotional harm.