Millions of people are accessing harmful AI “nudify” websites. New analysis says the sites are making millions and rely on tech from US companies.
For years, so-called “nudify” apps and websites have mushroomed online, allowing people to create nonconsensual and abusive images of women and girls, including child sexual abuse material. Although some lawmakers and tech companies have taken steps to limit these harmful services, millions of people still access the websites every month, and the sites’ creators may be earning millions of dollars a year, new research suggests.
An analysis of 85 nudify and “undress” websites—which let people upload photos and use AI to generate “nude” images of the subjects in just a few clicks—has found that most of the sites rely on tech services from Google, Amazon, and Cloudflare to operate and stay online. The findings, revealed by Indicator, a publication that investigates digital deception, show that the websites averaged a combined 18.5 million visitors a month over the past six months and may collectively be earning up to $36 million per year.