Kuala Lumpur police warn of 'silent call' AI voice‑cloning scam using 3–5 second clips
Law enforcement in Kuala Lumpur has warned of a 'silent call' scam in which fraudsters harvest brief voice samples and use AI voice‑cloning to impersonate relatives and demand urgent money transfers. Police advised the public not to speak first when answering calls from unknown numbers, to verify requests through separate channels, and to limit the personal audio they share online.
On December 12, 2025, Kuala Lumpur police issued an alert about an emerging 'silent call' scam that weaponizes AI voice‑cloning. Investigators described a pattern in which fraudsters solicit or scrape short audio snippets, often just three to five seconds long, and feed them into voice‑synthesis tools to create convincing impostor calls. The scammers then place urgent‑sounding calls to relatives or associates, impersonating a trusted voice and pressuring victims to transfer funds or hand over payment details immediately. Authorities warned that the technique needs very little source audio and can scale rapidly through social media and voicemail harvesting. Recommended precautions include not speaking first when answering unknown callers, asking questions only the real contact could answer, verifying requests through alternate known channels, using call‑blocking and spam filters, and minimizing publicly available voice or other personal content online. The advisory stresses rapid public education to blunt a technique that exploits advances in generative audio models.
Related Scam Types
Related Articles
Hiya Report: 1 in 4 Americans Received AI Deepfake Voice Calls, Scammers Outpacing Carriers
Study finds deepfake-enabled fraud occurring on an 'industrial scale' (AI Incident Database)