Lawrence, Kansas AI voice-cloning scam dupes woman into believing mother kidnapped
Lawrence, Kansas police reported a case in which a woman received an AI-generated voice call impersonating her mother, leading her to believe her mother had been kidnapped until investigators identified the audio as synthetic. Officials warned that family-emergency voice-cloning scams are on the rise and urged people to verify such calls before sending money or taking other emergency action.
Police in Lawrence, Kansas described an alarming incident in which a woman became convinced her mother had been abducted after receiving a convincing phone call that used an AI-cloned voice. Investigators analyzed the audio and determined it was synthetic, part of a growing trend in which scammers use voice cloning to create emotionally persuasive schemes that demand urgent payments or actions. Law enforcement emphasized the speed and realism of modern voice-synthesis tools, noting that these scams often bypass typical caller-ID checks and exploit familial trust. Officers urged immediate verification steps (contacting other family members, calling known numbers, or confirming a relative's whereabouts directly) before complying with any demands. The case highlights gaps in training and in legal frameworks that struggle to keep pace with rapid advances in generative audio technology, complicating both prosecution and prevention. Local authorities are coordinating with state and federal partners to track similar incidents and to advise the public on technical and behavioral mitigations, including recording suspicious calls, preserving evidence, and reporting scams promptly to aid investigations and disrupt the operators behind them.
Related Articles
Hiya Report: 1 in 4 Americans Received AI Deepfake Voice Calls, Scammers Outpacing Carriers
Study finds deepfake-enabled fraud occurring on an 'industrial scale', AI Incident Database