More likely outcome: AI makes scam farms more productive, not obsolete.

Here's the economics: pig butchering and romance scams work because sustained emotional manipulation over weeks or months generates high returns. That relationship phase still benefits from a human who can improvise, adapt to emotional cues, and maintain consistent identity details under pressure.

What AI does replace: the low-skill initial contact work. Instead of 50 trafficked workers doing cold outreach across social media, you run LLM-powered bots that contact 500,000 people, sustain casual conversation for 2-3 weeks, identify the ~0.1% who show real engagement, and route those warm leads to human operators.
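The funnel above is just arithmetic, and it's worth making explicit: 500,000 contacts at ~0.1% engagement yields about 500 warm leads. A minimal sketch, where the 10-marks-per-operator figure is purely an illustrative assumption (not from any real data):

```python
# Back-of-the-envelope funnel math for the AI-augmented model described above.
# All numbers are illustrative assumptions, not measurements.

def warm_leads(contacts: int, engagement_rate: float) -> int:
    """Leads surviving the automated bot-driven qualification phase."""
    return round(contacts * engagement_rate)

def operators_needed(leads: int, leads_per_operator: int) -> int:
    """Human closers required once bots hand off warm leads (ceiling division)."""
    return -(-leads // leads_per_operator)

bot_contacts = 500_000   # LLM bots handle the cold outreach
rate = 0.001             # ~0.1% show real engagement
leads = warm_leads(bot_contacts, rate)
humans = operators_needed(leads, 10)  # assume one human can work 10 marks

print(leads, humans)  # prints "500 50"
```

The point of the sketch: the same top-of-funnel volume that once required dozens of trafficked workers now needs a few dozen human closers at the bottom, which is exactly the labor shift the comment describes.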

Net effect: fewer trafficked workers needed for grunt work, more focused human effort on high-value marks.

The bottlenecks AI can't easily solve:

  • Video call liveness — targets still request video calls for verification; deepfakes are getting better but so is detection
  • Money movement — laundering funds requires human mule networks, shell companies, physical cash handling
  • Platform trust — aged accounts with real social graphs are still valuable and take time to build

Scam farms won't go obsolete. They'll evolve into AI-augmented hybrid operations: fewer but more efficient people, with AI handling prospecting and humans closing. Early versions of this hybrid model have already been observed in some operations.

The victims don't disappear — the trafficked workers might, which is a genuine improvement, but the financial harm to targets likely increases.