Police in Indore have opened an investigation after a fraudster used an AI‑generated voice clone of a relative to coerce a resident into transferring about Rs 1.83 lakh. Authorities warned that the case is part of a growing wave of voice‑clone scams that exploit emotional urgency.

A recent case from Indore highlights the rising threat of AI‑driven voice‑cloning fraud: a resident received a phone call in which the perpetrator used an AI‑generated voice that closely mimicked a family member to create urgency and demand immediate payment. The victim was persuaded to transfer about Rs 1.83 lakh before realizing the call was fraudulent; the city's cyber cell has opened an inquiry.

These so‑called "hi‑mom" or "voice‑clone" scams combine realistic synthetic audio, social engineering, and publicly available personal details to manufacture credible emergencies. Investigators warn that such frauds are often followed by requests to move funds through UPI, bank transfers, or gift cards to evade traceability.

Authorities advise the public to independently verify any urgent money request by calling a known number, enabling call‑screening features, and refusing payment until the caller's identity is confirmed through multiple channels. Banks and payment platforms should also be alerted immediately if funds are transferred under duress, as rapid reporting improves the chance of recovery.