The FBI and FTC issued warnings about a surge in romance and pig‑butchering scams that now incorporate generative AI to craft hyper‑personalized contacts, deepfake voices and videos, and persuasive crypto solicitations. The agencies reported large dollar losses, urged consumers to independently verify requests before sending money, and asked victims to report incidents to law enforcement.

U.S. consumer and law enforcement agencies have formalized alerts describing a wave of scams that blend traditional romance and investment fraud techniques with generative AI capabilities. The FBI and FTC notices, summarized by major outlets, detail schemes in which operators use AI to create convincing synthetic audio and video, generate believable backstories, and automate high‑volume social‑engineering campaigns that steer victims into transferring funds to crypto wallets or through unconventional payment channels such as gift cards. The guidance calls out the operational pattern known as pig‑butchering, prolonged grooming followed by an investment pitch, and highlights how AI lowers the cost of personalization and scales the reach of organized fraud groups, including those operating from overseas scam centers.

Agencies recommended concrete steps: independently verify identities, refuse to transfer funds to unverified crypto addresses, use two‑factor authentication, and report incidents promptly to help investigators identify infrastructure and recoverable assets. The warnings also emphasize collaboration with banks, crypto firms, and platform providers to disrupt payment flows and remove exploitative synthetic content.