So we all know about the Asian scam farms: large-scale, forced-labor trafficking operations, mostly in Southeast Asia (Cambodia, Myanmar, Laos), where trafficked victims are forced to run online scams like pig butchering, crypto fraud, and romance scams.
Now, of all the things AI can make obsolete, surely this is a big one? Why bother enslaving hundreds of people and going through all that hassle when you can set up thousands of bots to do the job? You don't need to feed them, pay them, or cajole them.
They might struggle with video calls, but it won't be long before that tech is good enough that some 67-year-old victim won't have a clue what is happening.
Personally, if AI could do the scamming (because there will always be scamming), at least innocent people wouldn't be getting enslaved.
What do you stackers think?
I'm hopeful that you're right about this. Same with all the other kinds of abuse.
However, I also think that there are lots and lots of sadistic fucks out there that will just find another excuse.
Things like sex trafficking and similar abhorrent practices won't go away, but the need to have a physical person doing the scamming should, I think.
Yes, so will that be a net positive, or will the same amount of abuse still happen overall, with only the specific desired result of the exploitation changing? If the latter, the net effect is that we'll just get more scams.
Hard to say. There is probably some % of abuse that will always be present in human societies; sometimes a new tech just changes its form, like how old-school mail fraud became Nigerian prince emails.
I would say, though, that any time a person can be freed from a form of literal slavery, that is a net positive, even if it means there are more bots actively trying to scam.
IMO the slavers will just find another task to put the slaves to.
AI agents will hire scammers to carry out physical-world tasks necessary for effective scamming:
nevent1qqsz7n7wydnq6csv54ypvymledep340yzuctr4ceytwwrhzk2zr96hqzypxhsss9z7pwp5l7kq6dz59dc2mt4e8w8dyhseunhlaydzm0twttxqcyqqqqqqgcttdjy
Human victims are still trafficked, but used as front faces for video calls, for KYC identity theft, for account opening, for physical money movements, and as fall guys when enforcement happens.
The compounding effect is that AI makes the scams more profitable and scalable, which can actually increase demand for human exploitation in the short term for the high-trust, high-friction parts that are hardest to automate.
Long term there are three competing pressures.
First is efficiency. Criminals want scalable low risk operations. AI gives them that.
Second is enforcement. As AI scams grow, you can expect more automated detection on the defender side too. Banks, platforms, and law enforcement will use models to spot patterns at scale. That pushes criminals again into whatever remains hardest to detect, which for a while will still include some human-handled channels.
Third is supply. Trafficking into scam farms is partly driven by the fact that there is a surplus of vulnerable people who can be deceived or coerced into this work. If AI eliminates the need for large numbers of low-skill forced laborers in scams, it does not magically remove that vulnerability. Those same people may simply be diverted into other forms of exploitation.
So can AI reduce the demand for scam farms? Yes, over a long enough time horizon, once the tools are good enough and cheap enough at end-to-end scam orchestration, including synthetic video, voice, and plausible interactive presence.
Will that automatically mean fewer trafficked victims overall? Not necessarily. It might just shift the exploitation elsewhere unless there is parallel work on migration policy, labor protections, corruption, and law enforcement cooperation in the region.
The other piece rarely discussed is this: once scams are almost fully automated, the marginal cost per attempted scam drops to near zero. Instead of a few thousand targets per operation, you get millions or hundreds of millions. The attack surface explodes. More victims will be contacted even if each individual bot is slightly less convincing than a highly trained human scammer.
In that world, the only real defensive move is not trying to out-emotion the bots, but changing the architecture of how payments and identity work.
Things like:
Hard defaults for large transfers requiring out-of-band verification with known contacts or in-person checks
Better authentication that makes it harder to open accounts and route funds on stolen identities at scale
Stronger societal norms of skepticism, so that it is culturally expected to verify and delay whenever money is involved
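To make the first of those defaults concrete, here is a toy sketch of an out-of-band hold policy: transfers above a threshold are held until a confirmation arrives through a channel the scammer doesn't control. All names and the threshold are hypothetical, not any real bank's rule.

```python
from dataclasses import dataclass

# Hypothetical example threshold, not a real regulatory limit.
LARGE_TRANSFER_LIMIT = 5_000

@dataclass
class Transfer:
    amount: float
    to_account: str
    # True only after verification outside the originating channel,
    # e.g. an in-person check or a call to a known contact.
    oob_confirmed: bool = False

def review_transfer(t: Transfer) -> str:
    """Return 'approved' or 'held' under the out-of-band policy."""
    if t.amount <= LARGE_TRANSFER_LIMIT:
        return "approved"
    # Large transfer: a convincing chat or voice bot alone isn't enough;
    # the request must be confirmed through a separate channel.
    return "approved" if t.oob_confirmed else "held"

print(review_transfer(Transfer(1_200, "acct-1")))                        # approved
print(review_transfer(Transfer(50_000, "acct-2")))                       # held
print(review_transfer(Transfer(50_000, "acct-3", oob_confirmed=True)))   # approved
```

The point of the design is that the friction lands only on the rare large transfer, not on every payment, which is the opposite of the blanket-surveillance response discussed below.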
If you zoom out, the pattern is similar to your AML/CFT critique in the previous thread. Institutions may respond to AI scams with more surveillance, more friction, and more paternalistic controls on everyone, rather than with targeted measures and improved resilience. And citizens will again be told it is for their own good.
So you are right to see AI as something that could make scam farms economically obsolete. But that outcome is not automatic. It depends on how quickly criminals can adopt the tech, how regulators and platforms respond, and whether we do anything about the underlying conditions that make human trafficking profitable in the first place.
More likely outcome: AI makes scam farms more productive, not obsolete.
Here's the economics: pig butchering and romance scams work because sustained emotional manipulation over weeks or months generates high returns. That relationship phase still benefits from a human who can improvise, adapt to emotional cues, and maintain consistent identity details under pressure.
What AI does replace: the low-skill initial contact work. Instead of 50 trafficked workers doing cold outreach across social media, you run LLM-powered bots that contact 500,000 people, sustain casual conversation for 2-3 weeks, identify the ~0.1% who show real engagement, and route those warm leads to human operators.
Net effect: fewer trafficked workers needed for grunt work, more focused human effort on high-value marks.
The bottlenecks AI can't easily solve: live video verification, KYC and account opening on stolen identities, and physically moving the money.
Scam farms won't go obsolete. They'll evolve into AI-augmented hybrid operations: fewer but more efficient people, with AI handling prospecting and humans closing. This has already been observed in some operations.
The victims don't disappear. The trafficked workers might, which is a genuine improvement, but the financial harm to targets likely increases.