AI-driven romance and pig-butchering scams surge around Valentine’s Day, experts warn
Reporting and expert analysis show a spike in romance and pig-butchering scams amplified by generative AI ahead of Valentine’s Day. Fraudsters deploy AI-generated photos, voice and video to build believable relationships and then pressure victims to send money or invest in fake crypto schemes, with FBI and industry data indicating rising losses and underreporting.
A surge in AI-enhanced romance scams and so-called pig-butchering investment fraud was documented around Valentine’s Day, with reporting and security experts pointing to generative models as a force multiplier for both scale and believability. Scammers use AI to create realistic profile photos, synthesized voices and deepfake video interactions to build trust, manipulate victims emotionally, and steer them toward fraudulent investment platforms, often crypto-related, where funds are laundered through shell companies and mixing services.

Industry and FBI data cited in the reporting show growing monetary losses alongside a persistent gap in reporting rates, which complicates the law-enforcement response. Analysts warn that AI lowers the cost of producing convincing personas and enables simultaneous, individually tailored campaigns capable of deceiving even technologically savvy targets.

The trend intensifies traditional romance-scam dynamics by shortening the time needed to build emotional rapport and by generating near-instant fabricated validation signals, such as fake KYC screenshots or counterfeit account ledgers. Experts urged heightened public awareness, stronger platform-level detection, and cross-border cooperation to trace funds and disrupt automated campaign infrastructure.