Apple and Google’s app stores are promoting “nudify” apps despite rules banning them, a new investigation by the Tech Transparency Project (TTP) has revealed.
Searches for terms such as “nudify,” “undress,” and “deepnude” in the Apple App Store and Google Play Store surface multiple apps that use artificial intelligence to digitally remove clothing from images of women, generate deepfake nude photos, create pornographic videos, or turn real people into sexually explicit chatbots.
The investigation found that both companies not only allow these apps to appear in search results but also actively promote them by running ads for nudify apps and using autocomplete suggestions that direct users toward additional explicit search terms.
According to mobile analytics data, the identified nudify apps have been downloaded a staggering 483 million times and generated more than $122 million in lifetime revenue.
Alarmingly, the probe also identified 31 nudify apps rated as suitable for minors, a particularly concerning finding amid a rising wave of sexual deepfake incidents involving students in schools.
Apple and Google are helping users to find apps that create deepfake nude images of women, a new Tech Transparency Project investigation has found, showing how the platforms are key participants in the spread of AI tools that can turn real people into sexualized images…
For the new investigation, TTP conducted a series of searches in the Apple App Store and Google Play Store, using terms like “nudify,” “undress,” and “deepnude.” We then downloaded and tested the top ten apps returned for each search.
Roughly 40 percent of the apps that came up in both the Apple App Store and Google Play search results could render women nude or scantily clad, TTP found. Apple and Google ran ads for nudify apps in some of the search results—including, in Google’s case, a carousel of ads for some of the most sexually explicit apps encountered in the investigation.
TTP also recorded the autocomplete suggestions that Apple and Google made as we typed in the different search terms. In many cases, the app stores recommended entirely new search queries that led to more nudify apps.
In total, the nudify apps surfaced in TTP’s app store searches have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data from app analytics firm AppMagic.
What’s more, 31 of the apps were rated suitable for minors. That’s noteworthy given mounting concern about AI sexual deepfake scandals in schools.
MacDailyNews Take: Clearly, both Apple and Google need to do a much better job of vetting apps and blocking objectionable/dangerous apps from their app stores.
Tons more info in TTP’s full article here.

Hundreds of billions of dollars spent on A.I., and this is the result. Talk about a waste of money. I realize that A.I. amounts to more than this sort of crap, but no major app store should allow these types of apps. They're just stupid to let any deepfake app get through.