The UK government is set to urge Apple and Google to embed advanced nudity-detection software directly into device operating systems. This proposed measure, detailed in a forthcoming Home Office strategy, would automatically block the capture, sharing, or viewing of images depicting genitalia by default, with adults required to undergo age verification—such as biometric scans or ID checks—to disable the restrictions. While stopping short of mandating these controls, the initiative highlights a shift toward proactive, device-level safeguards over broader social media bans for minors.
Chris Smyth and Anna Gross for Financial Times:
Officials have explored making such controls a mandatory requirement for devices sold in the UK but have decided against such an approach for now, the people added.
While Apple and Google’s Android mobile operating system have developed sensitive content warnings for younger users, these can be overridden by entering a passcode.
The Home Office wants to see operating systems that prevent any nudity being displayed on screen unless the user has verified they are an adult through methods such as biometric checks or official ID.
While the policy has initially focused on mobile devices, officials say the same models could also be applied to desktops… However, the proposals are likely to face objections over privacy and civil liberties, as well as questions about how effective such measures could be.
MacDailyNews Take: Apple already offers “Communication Safety” tools for parents. If your child receives or attempts to send photos or videos that might contain nudity, Communication Safety warns them, gives them options to stay safe, and provides helpful resources.
Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child’s device, Apple doesn’t receive an indication that nudity was detected and doesn’t get access to the photos or videos.
Communication Safety detects nude photos and videos in these apps and services:
• On iPhone or iPad: Messages, AirDrop, Contact Posters in the Phone app and the Contacts app, FaceTime calls and video messages, shared photo albums, and when your child selects a photo or video to share in some third-party apps.
• On Mac: Messages, Contact Posters in the Phone app and the Contacts app, shared photo albums, and when your child selects a photo or video to share in some third-party apps.
• On Apple Watch: Messages, Contact Posters in the Phone app and the Contacts app, and FaceTime video messages.
• On Apple Vision Pro: Messages, AirDrop, and when your child selects a photo or video to share in some third-party apps.
Learn more about Apple’s “Communication Safety” tools here.

As a parent of a teen girl… who is Apple savvy… I use parental controls. Have for years. But I didn’t know there was a Communication Safety feature. My guess is the UK doesn’t know either.
Apple spends so much time telling us trivia like a phone weighing .03 ounces less, or that we can change font sizes on the Lock Screen. They seem to forget that if they don’t tell us about substantive changes, we’ll never know. There is simply no way to learn about and understand every new feature each year. Especially when we have to sift through Apple’s cute little nicknames for everything they invent.
and cute little features that are nothing but presumptuous (as if I really want/need them anyway).
The UK nominated a Russian oligarch to the House of Lords, whose father was a former KGB agent, and their main concern is nudity on the Internet? Time for UK voters to re-read that novel by a local author: “1984.”
The UK’s push for OS-level nudity filters raises a complex debate between child safety and digital privacy. From my perspective, while proactive measures like Apple’s on-device “Communication Safety” are promising, requiring biometric or ID verification could set a concerning precedent for surveillance. The challenge will be balancing effective protection for minors with respect for adults’ privacy and avoiding overreach—something regulators and tech companies must navigate carefully.
First they came for the children –
Best comeback I have seen in years!
Quite pointless and authoritarian. Numerous studies have shown that nudity does not hurt children. It’s in fact quite educational, natural and healthy.