Apple on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service, to ensure the uploads do not match known images of child sexual abuse. Some privacy advocates, however, said the system could open the door to monitoring of political speech or other content on iPhones. The Electronic Frontier Foundation (EFF) describes it thusly: “Apple is planning to build a backdoor into its data storage system and its messaging system.”
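In outline, such a check works by computing a hash of each photo on the device and comparing it against a database of hashes of known abuse images; only matches are flagged. Below is a minimal sketch of that idea, assuming a simple exact-hash lookup — the hash function and database contents here are hypothetical stand-ins, not Apple’s actual NeuralHash system, which uses perceptual hashing so that visually similar images hash alike:

```python
import hashlib

# Hypothetical stand-in for a perceptual hash. A cryptographic hash is
# used here only to keep the sketch self-contained.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The on-device database of hashes of known prohibited images.
# Its contents are opaque: neither the device owner nor outside
# auditors can see what is in it.
known_hashes = {image_hash(b"known-image-A"), image_hash(b"known-image-B")}

def flag_before_upload(photo: bytes) -> bool:
    """Return True if the photo matches the database and would be reported."""
    return image_hash(photo) in known_hashes

print(flag_before_upload(b"known-image-A"))   # matches the database
print(flag_before_upload(b"vacation-photo"))  # does not match
```

Note that the matching logic never inspects the photo’s meaning; it only answers “is this hash in the list?” — which is exactly why critics focus on who controls the list.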
Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
MacDailyNews Take: Either Apple was placed in an untenable position to introduce this backdoor, destroying its vaunted claims of protecting privacy, or Tim Cook has completely lost the plot.
There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever.
We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust. — Apple CEO Tim Cook, October 1, 2015
Things that make you go hmmm. When somebody spends years insisting, correctly, that water is wet and then, all of a sudden, claims that, no, water is actually dry, then clearly something is rotten in the state of Denmark.
Something caused Tim Cook to kowtow to this backdoor. What was it?
Again, encryption is binary; it’s either on or off.
You cannot have both. You either have privacy via full encryption or you don’t by forcing back doors upon Apple. It’s all or nothing.
Without strong encryption (meaning no back doors), U.S. companies’ tech products would be eschewed around the world. — MacDailyNews, February 2, 2017
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.
We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire…
The bottom line here is that, whatever privacy and security protections the technical details may provide, all photos uploaded to iCloud will be scanned.
Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement.
Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images. Civil liberties organizations have asked the company to remove its ability to do so. But Apple is choosing the opposite approach and giving itself more knowledge of users’ content.
MacDailyNews Take: Some report that the FBI or other governmental apparatuses pressured Apple never to end-to-end encrypt iCloud backups, as a company actually concerned with privacy and security would have done years ago. Clearly, with the continued existence of unencrypted iCloud backups and this new backdoor intrusion into users’ devices, Apple’s privacy claims ring hollow and now amount to nothing more than bullshit marketing.
We expected Apple to be better. Apple failed.
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them. https://t.co/tylofPfV13
— Matthew Green (@matthew_d_green) August 5, 2021
It is also important to note that Apple has chosen to use the notoriously difficult-to-audit technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly “sexually explicit” content. When blogging platform Tumblr instituted a filter for sexual content in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. Facebook’s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen’s Little Mermaid. These filters have a history of chilling expression, and there’s plenty of reason to believe that Apple’s will do the same.
Since the detection of a “sexually explicit image” will be using on-device machine learning to scan the contents of messages, Apple will no longer be able to honestly call iMessage “end-to-end encrypted.” Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the “end-to-end” promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company’s stance toward strong encryption…
People have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users’ devices.
MacDailyNews Take: Because Apple has decided to destroy years of claims of privacy protection from the CEO on down, we have to wonder if Apple was promised something – implicitly or explicitly – in return for this horrible and stupid decision. Would the U.S. and other governments go easier on the company in antitrust proceedings or taxes or something? The decision is so flummoxing: why else would a company so adamantly opposed to adding backdoors into its devices make such a shocking privacy and security about-face?
Again, as it’s likely meant to be, this sounds wonderful at first glance (everyone’s for detecting and rooting out purveyors of child pornography) and horrible once you think about it for more than a second (massive, awful potential for misuse).
It’s a huge can of worms. It’s a backdoor, plain and simple, and it neatly negates Apple’s voluminous claims of protecting users’ privacy.
All of the marketing money, TV ads, and solemn privacy pronouncements from Tim Cook over the last several years can’t stand up to the simple fact that now, in a nutshell, Apple will scan Photos and Messages.
It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything.
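That is the crux of the architectural objection: the matching code itself is content-agnostic, so what gets flagged is determined entirely by whichever hash database is loaded onto the device. A hypothetical sketch (the hash values and database names below are invented purely for illustration):

```python
def scan(photo_hash: str, database: set[str]) -> bool:
    # The matcher has no notion of *what* it is matching against;
    # swapping the database silently changes what gets flagged.
    return photo_hash in database

csam_db = {"a1b2", "c3d4"}   # hypothetical hashes of known CSAM
dissent_db = {"e5f6"}        # hypothetical hashes of banned political imagery

# The identical code serves either purpose, unchanged:
print(scan("c3d4", csam_db))      # flagged under the original mandate
print(scan("e5f6", dissent_db))   # flagged under an expanded one
```

No engineering effort separates the two uses — only the contents of the database, which is exactly the “configuration tweak” the EFF warns about above.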
Smart people will begin looking for alternatives to Apple’s iCloud Photos service and its Photos and Messages apps.
To turn off your iCloud Photos everywhere, follow these steps:
• On your iPhone, iPad, or iPod touch, go to Settings > [your name] > iCloud > Manage Storage > Photos, then tap Disable & Delete.
• On your Mac, go to Apple menu > System Preferences, then click Apple ID. Choose iCloud, then click Manage. Select Photos, then click Turn Off and Delete.
If you change your mind, follow the steps above on your device, then select Undo Delete.
Photos and videos are stored in your account for 30 days. To download them before then:
• On your iOS device, go to Settings > [your name] > iCloud > Photos, then select Download and Keep Originals.
• On your Mac, open Photos, choose Photos > Preferences, click iCloud, then select Download Originals to this Mac.
You can also select the photos and videos that you want to download from iCloud.com.
We hope Apple’s scanning system is never used for anything besides CSAM, but given the pressure that governments can wield and the degree to which Apple is beholden to China, we highly doubt it.
Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.
And by the time we find out it was a mistake, it will be way too late.
— Matthew Green (@matthew_d_green) August 5, 2021