EFF: Apple’s photo scanning backdoor is a shocking privacy and security about-face

Apple on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service, to ensure they do not match known images of child sexual abuse. Some privacy advocates warn that the system could open the door to monitoring of political speech or other content on iPhones. The Electronic Frontier Foundation (EFF) describes it thusly: “Apple is planning to build a backdoor into its data storage system and its messaging system.”
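
To make the mechanism concrete: at its core, the system computes a fingerprint of each photo on the device and compares it against a database of fingerprints of known abuse images before upload. Below is a minimal, purely illustrative Python sketch of that idea. The SHA-256 digest, the KNOWN_BAD_HASHES set, and the "camera_roll" folder are assumptions made for the example; Apple's actual design reportedly uses a perceptual "NeuralHash" with blinded, threshold-based matching, none of which this toy implements.

import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images (illustrative only;
# the real database is opaque to the user and blinded on-device).
KNOWN_BAD_HASHES = {
    "3b6a27bcceb6a42d62a3a8d02a6f0d73653215771de243a63ac048a18b59da29",
}

def fingerprint(photo: Path) -> str:
    # Stand-in for a perceptual hash: a plain SHA-256 of the file's bytes.
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def flagged_before_upload(photo: Path) -> bool:
    # Pre-upload check: hash locally, compare against the known-image list.
    return fingerprint(photo) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    # "camera_roll" is a hypothetical local folder standing in for the photo library.
    for photo in Path("camera_roll").glob("*.jpg"):
        if flagged_before_upload(photo):
            print(f"{photo} matches the database and would be flagged before upload")

Even this toy makes the critics’ point visible: whoever controls the contents of that hash list controls what gets flagged.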

India McKinney and Erica Portnoy for EFF:

Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

MacDailyNews Take: Apple must have been placed in an untenable position to introduce this backdoor, destroying its vaunted claims of protecting privacy, or Tim Cook has completely lost the plot.

There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever.

We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust. — Apple CEO Tim Cook, October 1, 2015

Things that make you go hmmm. When somebody spends years insisting, correctly, that water is wet and then, all of a sudden, claims that, no, water is actually dry, then clearly something is rotten in the state of Denmark.

Something caused Tim Cook to kowtow to this backdoor. What was it?

Again, encryption is binary; it’s either on or off.

You cannot have it both ways. Either you have privacy via full encryption, or you don’t because back doors have been forced upon Apple. It’s all or nothing.

Without strong encryption (meaning no back doors), U.S. companies’ tech products would be eschewed around the world. – MacDailyNews, February 2, 2017

See also: How to jailbreak your iPhone

McKinney and Portnoy continue for EFF:

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire…

The bottom line here is that whatever privacy and security aspects are in the technical details, all photos uploaded to iCloud will be scanned.

Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement.

Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images. Civil liberties organizations have asked the company to remove its ability to do so. But Apple is choosing the opposite approach and giving itself more knowledge of users’ content.

MacDailyNews Take: Some reports claim that the FBI or other governmental apparatuses pressured Apple never to end-to-end encrypt iCloud backups, as a company that’s actually concerned with privacy and security would have done years ago. Clearly, with the continued existence of unencrypted iCloud backups and this new backdoor intrusion into users’ devices, Apple’s privacy claims ring hollow and now amount to nothing more than bullshit marketing.

We expected Apple to be better. Apple failed.

McKinney and Portnoy continue for EFF:

It is also important to note that Apple has chosen to use the notoriously difficult-to-audit technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly “sexually explicit” content. When blogging platform Tumblr instituted a filter for sexual content in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. Facebook’s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen’s Little Mermaid. These filters have a history of chilling expression, and there’s plenty of reason to believe that Apple’s will do the same.

Since the detection of a “sexually explicit image” will be using on-device machine learning to scan the contents of messages, Apple will no longer be able to honestly call iMessage “end-to-end encrypted.” Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the “end-to-end” promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company’s stance toward strong encryption…

People have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users’ devices.

MacDailyNews Take: Because Apple has decided to destroy years of claims of privacy protection from the CEO on down, we have to wonder whether Apple was promised something – implicitly or explicitly – in return for this horrible and stupid decision. Would the U.S. and other governments go easier on the company in antitrust proceedings or taxes or something? Apple’s decision is so flummoxing; why else would a company that was so adamantly opposed to adding backdoors into its devices make such a shocking privacy and security about-face?

Again, as it’s likely meant to be, this sounds wonderful at first glance (everyone’s for detecting and rooting out purveyors of child pornography) and horrible once you think about it for more than a second (massive, awful potential for misuse).

It’s a huge can of worms. It’s a backdoor, plain and simple, and it neatly negates Apple’s voluminous claims of protecting users’ privacy.

All of the marketing money, TV ads, and solemn privacy pronouncements from Tim Cook over the last several years can’t stand up to the simple fact that now, in a nutshell, Apple will scan Photos and Messages.

It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything.

Smart people will begin looking for alternatives to Apple’s iCloud Photos service and its Photos and Messages apps.

To turn off your iCloud Photos everywhere, follow these steps:

• On your iPhone, iPad, or iPod touch, go to Settings > [your name] > iCloud > Manage Storage > Photos, then tap Disable & Delete.

• On your Mac, go to Apple menu > System Preferences, then click Apple ID. Choose iCloud, then click Manage. Select Photos, then click Turn Off and Delete.

If you change your mind, follow the steps above on your device, then select Undo Delete.

Photos and videos are stored in your account for 30 days. To download them before then:

• On your iOS device, go to Settings > [your name] > iCloud > Photos, then select Download and Keep Originals.

• On your Mac, open Photos, choose Photos > Preferences, click iCloud, then select Download Originals to this Mac.

• You can also select the photos and videos that you want to download from iCloud.com.

We hope Apple’s scanning system is never used for anything else besides CSAM, but, given the pressures that governments can wield and the degree to which Apple is beholden to China, we highly doubt it.

33 Comments

  1. I said it yesterday and believe it even more so today:

    Tim Cook is either a fool or evil.

    I think it’s the former, but I haven’t been fully dissuaded of the latter.

    It starts with The Children™.

    It ends with:

    Freedom is never more than one generation away from extinction. We didn’t pass it to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same, or one day we will spend our sunset years telling our children and our children’s children what it was once like in the United States where men were free. – Ronald Reagan, March 30, 1961

  2. FBI Director Chris Wray: “Oh, and while you’re at it, could you scan for MAGA hats and TRUMP flags too?” “Also, we need access to pics sent with iMessage too, just to make sure. We promise not to scan any of the texts. Totally promise.”

    1. Yes, God forbid anybody would want America to be great, with smaller government, personal freedom, high employment, and low taxes!

      That doesn’t jibe with globalist establishment leftists who claim that COVID is a greater scourge than the bubonic plague while throwing open the border to a flood of COVID-infected illegal aliens and then shipping them around the country.

      To leftist Democrat authoritarians, Real Americans must be persecuted. How long until plays of “God Bless the U.S.A.” by Lee Greenwood are scanned and flagged in people’s Apple Music and cross-matched with Apple Maps Look Around (Street View) images containing certain political yard signs?

      Tim Cook needs to go. He’s a cancer on Apple. Always has been.

  3. LOL. Paranoid much? The same articles were written about having fingerprint scanners on phones, and facial recognition. Not hearing those anymore. If you actually see what Apple is talking about, it is within the phone, under parental protocols, checking for nudity in children’s message photos and reporting with choices to allow or disallow the photos.

    Of course, click baiters know that the tinfoil hats will see a vast conspiracy, as usual, and publish accordingly. Manipulation of the simpleminded was once frowned upon as “sharp practise”, but is now America’s biggest industry.

    1. Those are some pretty seriously false equivalencies. Are you in the Valley? Because it sure sounds like modern Valley false logic, and that is what got us here in the first place.

      If this is true, how can you presume to even know ‘what it was like’ if you were not even born, or at the least, too young to notice?

  4. I guess that’s it then. There is no major tech company left that has anything even resembling user privacy. I have been disappointed in the Cook Apple for plenty of reasons, but this takes the cake. The younger generation that inherited Silicon Valley have well and truly destroyed it.

  5. Just asking. If the scanning is “on device,” will they still be scanning whether or not you use iCloud? What’s to stop them from doing this if the tech to do it is already on device?

    1. It has been on your device for years. How do you think Photos collects People albums? It scans the content of your images for faces and applies facial recognition. That is potentially far more dangerous than simply comparing your images against a database of known child sexual materials. Searches in Siri or Mail work because your computer scans all your text files. The train has already left the station.

    2. I think the scanning for pictures is on your own device. Your device then flags those pictures, indicating a match, in the cloud. The cloud is only scanned for those flags.
      So without the cloud, the flags are never seen and your phone’s detection of a picture is never observed.

  6. Yes, people can use the same back door for bad as well as good… but do any of you commenters here have children? Have any of you searched for pedophiles in your neighborhood? If so, you’ll be shocked how many live so close to you. Apple’s tech is already being used for Good and Bad, by the Taliban and al Qaeda to communicate terrorist plots and by pedophiles to take photos of your kid getting off the bus or coming out of school to meet you at your car. Ask yourself how you feel after the FBI nabs that creep living 3 doors down from you and they find 1,000 pics on his iPhone, including some with your kids on there!

    1. Yes, I have children. No, I do not want Apple destroying privacy and delivering the tools for Big Brother intrusions (ironic given Apple’s famous Mac ad).

      There are other ways to find pedophiles than by destroying everyone’s privacy.

      “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” – Benjamin Franklin

    2. Get real with the paranoia. Pedophilia, while totally disgusting, is a minuscule problem by the numbers. Nearly every parent takes pictures of their little kids splashing around in the tub or whatever, which means every parent is suddenly at risk from this kind of snooping. Let’s haul in 1000 or 10000 parents to catch one pedo…seems like a great idea.

      1. You are only at risk if you upload your baby bath pictures to a kid porn site, wait until they are added to the CSAM database, and then try to upload them to iCloud. That probably does not apply to thousands of innocent parents.

  7. I have rarely read so much willful ignorance as all this. It is not a “back door.” It is 100% under the control of the device owner. Don’t want your photos scanned before they are uploaded to iCloud? Then don’t upload your photos to iCloud! Problem solved.

    If you use iCloud, Google, or another cloud storage system, realize that they are already being scanned for child pornography now on the server side, and this would replace that with a much less intrusive system. Your iPhone is already scanning all your pictures with facial recognition, which is potentially far more dangerous, but nobody complains about that. That information stays on your device unless you choose to share it.

    If you don’t like that, then don’t use a cloud storage system, or lobby Congress to repeal either the Communications Decency Act or the laws against sexually exploiting children. Don’t blame Apple for trying to obey the law in the least intrusive manner they can find.

    1. So blathers the King of Willful Ignorance. Mr. Useful Idiot, naive pawn of the corrupt globalist establishment, of which Tim Cook is certainly a part.

  8. I never liked any of the cloud, Apple or otherwise. It comes down to this: “Do you feel comfortable leaving your data on another person’s computer?” I never did.

  9. The government has been forcing big tech to run CSAM checks on their cloud servers… AND BE LIABLE FOR CONTENT ON THEIR CLOUD. Apple is, at this point, protecting itself from being left holding child porn on iCloud. At this point, to hell with privacy and the big marketing bull. Let the “me first” company cover itself.

    The can of worms is open. Let’s see what will be next in terms of the socially non-acceptable.

    So long to the privacy/security marketing pitch. They will use new terms for it, and I think we’re gonna love it!

  10. CCP China pressed him for the back door and he caved. The social responsibility is just a smoke screen.

    Big tech and government have already laid the groundwork that unapproved speech and opinions are dangerous and cause death. So “for the children” is the first step, but it’s not the last step, and it’s not the real reason the back door is being installed; it’s just the excuse. Somehow I doubt Apple will use its strong arm to hold Amazon Web Services or the CCP slave labor camps accountable.

  11. And you can be certain that Apple has already been doing this for years to “test” it. It’s forever surveillance from Big Brother Apple. Like many totalitarian initiatives, they sell it as “protecting the children”, but don’t be shocked if TC and the entire leadership team are compromised by “indecent images” now leveraged against them. The creatures in the shadows want to know everything about you.

  12. I am really amazed how people are defending this.

    First, the motive (fighting child porn) is just an enabler to get the tech on your phone, nothing else. “Think of the children” always works.

    A great concern for me is why we should trust that CSAM database in the first place. What people are running it? How are they audited? Is there independent oversight? How can one actually verify for oneself that the database indeed only contains material for said purpose?

    What is to stop nefarious people from the intelligence and security sector, or corrupt or criminal people, from secretly adding their own images to be checked (Find My Annoying Journalist, Find My Whistleblower, Find My Political Opponent?), or somehow linking in their own database? How would one know this not to be the case? Regimes from all over the world will come with their own databases to hook up, with all legal provisions in their own country taken care of. No doubt about that.

    For now it is limited to Apple’s iCloud – which sadly now definitely won’t see end-to-end encryption, that much is clear – but who is to say that won’t change to a broader scope? Moving goalposts and mission creep are all but guaranteed to happen.

    But most principally: how does it even help? I have young children myself. Does this help against a pervert sending his erect thing or her big boobs? Or other self-made material? No, not from what I can see. What real-world scenario does it actually address? What I have come to understand over the years is that pedophiles operate in very refined ways, ahead of the curve, and probably know very well how to avoid tech like this. They won’t use these cloud services. So what does it all actually accomplish, other than putting this dangerous slippery-slope tech on all our phones?

  13. Apple’s new ad: What’s on your iPhone stays on your iPhone, until we say otherwise, get pressured by the government, or just don’t like what you say or what you have on your device.

    The issue is not that Apple is doing this, because other big companies already do; the issue is that Apple markets itself as a privacy champion. Tim Cook, you said… “At Apple we think privacy is a human right.” I’m calling Bull Shit.
