Malaysian police warn of silent calls that could harvest voice samples for AI voice‑cloning fraud
Malaysian police have cautioned the public about unexplained "silent" inbound calls that may be used to collect voice samples for later AI voice-cloning scams, urging people to treat unknown calls with suspicion and report them. Authorities said they have received no confirmed complaints yet but urged vigilance as deepfake and voice-cloning attacks rise globally.
Malaysian law enforcement issued an advisory about a spate of unexplained inbound "silent" calls, warning that scammers may be testing lines or capturing brief voice samples in order to construct AI-generated voice clones for later use in fraudulent schemes. Police urged citizens to treat calls from unknown numbers with heightened caution, to avoid speaking when an unsolicited number calls, and to report suspicious calls to authorities and telecom providers so investigators can analyze call patterns and trace potential operators.

While officials said they had not yet received formal complaints directly linking the silent calls to confirmed voice-harvesting misuse, they emphasized the rapid improvement of generative audio tools and the potential for malicious actors to use cloned voices to impersonate family members, employers, or financial institution representatives.

The advisory included practical steps: do not disclose sensitive personal or financial information during unsolicited calls, enable call-screening features where available, and notify banks immediately if an impersonation attempt leads to a suspected compromise. Separately, local reporting by Hi-Line Today on a case in which an AI-manipulated voice was used to extract banking details from a small business illustrates the growing community-level risk and the need for proactive awareness and law enforcement coordination.