Apple is cracking down on a category of AI image generation apps that “advertised the ability to create nonconsensual nude images.” According to a new report from 404 Media, Apple has removed multiple such apps from the App Store.
On Monday, 404 Media published a report exploring how companies were using Instagram advertising to promote apps that could “undress any girl for free.” Some of these Instagram ads took users directly to Apple’s App Store, where the app in question was described simply as an “art generator.”
Today’s report says Apple did not initially respond to 404 Media’s request for comment on Monday. After the initial story was published, however, the company reached out directly to ask for more information. Once it was provided with direct links to the specific ads and App Store pages, Apple removed those apps from the App Store.