If Apple no longer protects user privacy, why buy an iPhone or an iPad?

Apple has long made privacy central to its marketing pitch for iPhones and iPads, but with the looming installation of a backdoor that will, ostensibly, scan only for Child Sexual Abuse Material (CSAM), Apple’s iPhones and iPads will no longer protect user privacy as robustly as the company once claimed.

EFF: Apple's photo scanning backdoor is a shocking privacy and security about-face

Jack Nicas for The New York Times:

Apple unveiled a plan two weeks ago founded in good intentions: Root out images of child sexual abuse from iPhones.

But as is often the case when changes are made to digital privacy and security, technology experts quickly identified the downside: Apple’s approach to scanning people’s private photos could give law enforcement authorities and governments a new way to surveil citizens and persecute dissidents. Once one chip in privacy armor is identified, anyone can attack it, they argued.

“Once you create that back door, it will be used by people whom you don’t want to use it,” said Eva Galperin, the cybersecurity director at the Electronic Frontier Foundation, a digital-rights group. “That is not a theoretical harm. That is a harm we’ve seen happen time and time again.”

Apple was not expecting such backlash.

MacDailyNews Take: Really, after the quote below, Apple wasn’t expecting such backlash? Why the hell not?

Apple Inc., “A Message to Our Customers,” February 16, 2016:

Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control…

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them…

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.


Privacy means people know what they’re signing up for, in plain English and repeatedly. I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data. — Steve Jobs

Cybersecurity experts, the head of the messaging app WhatsApp and Edward J. Snowden, the former intelligence contractor who leaked classified documents about government surveillance, all denounced the move as setting a dangerous precedent that could enable governments to look into people’s private phones. Apple scheduled four more press briefings to combat what it said were misunderstandings, admitted it had bungled its messaging and announced new safeguards meant to address some concerns. More than 8,000 people responded with an open letter calling on Apple to halt its moves.

As of now, Apple has said it is going forward with the plans. But the company is in a precarious position. It has for years worked to make iPhones more secure, and in turn, it has made privacy central to its marketing pitch…

[Apple’s new software] scans users’ iPhone [and/or iPad] photos and compares them against a database of known child sexual abuse images.

To prevent false positives and hide the images of abuse, Apple took a complex approach. Its software reduces each photo to a unique set of numbers — a sort of image fingerprint called a hash — and then runs them against hashes of known images of child abuse provided by groups like the National Center for Missing and Exploited Children.

If 30 or more of a user’s photos appear to match the abuse images, an Apple employee reviews the matches. If any of the photos show child sexual abuse, Apple sends them to the authorities and locks the user’s account. Apple said it would turn on the feature in the United States over the next several months…
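MacDailyNews Note: In concrete terms, this is threshold matching. Here is a minimal sketch of the general idea in Python. It is illustrative only: every name in it is hypothetical, a cryptographic hash stands in for Apple’s NeuralHash perceptual hash (which is designed to survive resizing and recompression), and Apple’s actual protocol wraps the comparison in private set intersection and threshold secret sharing so that neither Apple nor the device learns anything about matches below the threshold.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for Apple's NeuralHash. A real perceptual hash
# matches visually similar images; SHA-256 matches exact bytes only.
def image_fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_paths, known_csam_hashes):
    """Count how many of a user's photos match the known-image database."""
    return sum(1 for p in photo_paths if image_fingerprint(p) in known_csam_hashes)

MATCH_THRESHOLD = 30  # per the article, human review begins only at 30+ matches

def needs_human_review(photo_paths, known_csam_hashes) -> bool:
    return count_matches(photo_paths, known_csam_hashes) >= MATCH_THRESHOLD
```

The matching code itself is trivial; as the critics below argue, the privacy question turns entirely on who controls the hash database and what goes into it.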

Computer scientists, as well as privacy groups and civil-liberty lawyers, immediately condemned the approach… Other tech companies, like Facebook, Google and Microsoft, also scan users’ photos to look for child sexual abuse, but they do so only on images that are on the companies’ computer servers. In Apple’s case, much of the scanning happens directly on people’s iPhones. (Apple said it would scan photos that users had chosen to upload to its iCloud storage service, but scanning still happens on the phone.)

To many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can look at a person’s private data and report it to law enforcement authorities. Privacy groups and security experts are worried that governments looking for criminals, opponents or other targets could find plenty of ways to use such a system.

If governments had previously asked Apple to analyze people’s photos, the company could have responded that it couldn’t. Now that it has built a system that can, Apple must argue that it won’t.

In response, Apple has assured the public that it will not accede to such requests. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company said in a statement.

MacDailyNews Take: Bullshit.

When Apple removed VPN apps – or The New York Times app, or whatever app or service (Apple Books, for one example) the CCP demanded – from the App Store or Apple devices in China, the company told the developers of ExpressVPN, one of the apps affected, that its app was removed because “it includes content that is illegal in China.” In a subsequent statement, Apple said: “Earlier this year China’s MIIT announced that all developers offering VPNs must obtain a license from the government. We have been required to remove some VPN apps in China that do not meet the new regulations.”

Apple, which famously explains away app bans in certain countries by saying it must comply with all local laws, may refuse requests that lack the force of law. But, judging by its long history, once a government passes a law requiring the system to check phones for whatever an oppressive regime dreams up to monitor, categorize, and punish citizens who stray from official doctrine, Apple will comply.

Apple, which, under Tim Cook, has attached itself to the China teat so firmly as to be hopelessly compromised, is lying by omission. Yes, they’ll refuse a “request,” but what will they do when it becomes law to scan for material related to politics, religion, sexual orientation, and the like? Apple will follow the local laws, as it has in every known case of app banning or service removal.

Apple should never introduce this system into China, because Apple is not going to stand up to the CCP when the CCP could severely hamper or even shut down Apple’s business in a Beijing minute.

Xi Jinping walks Tim Cook on a leash

If the U.S., the first country where this scanning is planned, passes a law to scan for, say, “terrorists” (broadly defined, of course, which would allow scanning for virtually anything in every photo on the device), history shows that Apple will follow that law.

Apple will refuse requests from governments until it becomes law. Then, as usual, Apple will follow the laws in each country where it operates.

For years, technologists have argued that giving the police a way into phones would fundamentally undermine the devices’ security, but now governments can point to Apple’s endorsement of its photo-scanning tools as a method that helps the police while preserving privacy.

MacDailyNews Take: So, if Apple no longer protects user privacy, why buy an iPhone or an iPad?

Well, build quality is among the reasons that spring to mind, although Samsung seems to have figured out how to offer in-display fingerprint ID, pinhole cameras without giant, unsightly, content-occluding notches, advanced foldable displays, and more, while the images produced by Google’s Pixel phone cameras are often rated higher than those from Apple’s best iPhones. So that argument isn’t as cut-and-dried as it once was.

Apple’s ecosystem, even if it will no longer protect users’ privacy to the degree Apple previously claimed and promoted, is still far more unified than competitors’ thanks to Continuity, so users of multiple Apple devices can certainly benefit. Apple Watch users, of course, will need to stay with iPhone as well.

Currently, iPhones and iPads hold their value much better than competing devices. iPhones and iPads also, in general, get the best apps first; even today, many Android apps are lesser ports of iOS/iPadOS apps.

Apple’s famous ease of use is an advantage over Android phones and tablets, although Google narrows that gulf with each new Android release.

Another consideration is Apple Pay. If you use Apple Pay and want to continue using it, you’ll need an iPhone. Of course, many alternatives, like Google Pay, exist for Android phones, tablets, and watches.

Finally, Apple’s chips are faster and Apple’s control of the whole widget makes for generally faster, smoother, more efficient devices.

So, even if Apple no longer protects user privacy to the extent they once claimed, there are still plenty of reasons to buy an iPhone or an iPad over competing choices.

24 Comments

  1. I really don’t understand the furor over Apple’s proposed CSAM system. Google, Microsoft, and Facebook already openly scan all user images in the cloud for CSAM. Where is the uproar about that?

Apple has found a way to provide full cloud encryption while still dealing with CSAM. It does so in a way that prevents Apple from actually looking at users’ cloud photos. There is no intrusion into user privacy whatsoever until there is clear evidence of a threshold level of known kiddie porn in the user’s possession. Even then, there are checks and balances to further protect privacy in the event of false positives.

    Apple’s solution satisfies the needs of law enforcement without wholesale snooping into people’s photos. It’s a better solution to balance the needs of society with user privacy.

    Does privacy really mean the freedom to traffic kiddie porn images on cloud servers?

    Maybe Apple should just throw in the towel and scan iCloud images in the open like everybody else is already doing. I wonder if Apple would get a pass for doing that?

    Any theoretical misuse of Apple’s CSAM system can already be done in the open with unencrypted cloud photo libraries. In fact, the slope is much more slippery without end-to-end encryption.

Apple is damned if they do and damned if they don’t…

    1. It’s not difficult to understand unless you don’t have the mental capacity to extrapolate or do not desire to do so.

      19 August 2021

      Tim Cook
      CEO, Apple, Inc.

      Dear Mr. Cook:

      The undersigned organisations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products. Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.

      Apple announced that it is deploying a machine learning algorithm to scan images in its text messaging service, Messages, to detect sexually explicit material sent to or from people identified as children on family accounts. This surveillance capability will be built right into Apple devices. When the algorithm detects a sexually explicit image, it warns the user that the image may be sensitive. It also sends a notice to the organiser of a family account whenever a user under age 13 chooses to send or to receive the image.

      Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.

      Apple also announced that it would build into the operating system of its products a hash database of CSAM images provided by the National Center for Missing and Exploited Children in the United States and other child safety organisations. It will scan against that database every photo its users upload to iCloud. When a preset threshold number of matches is met, it will disable the account and report the user and those images to authorities. Many users routinely upload the photos they take to iCloud. For these users, image surveillance is not something they can opt out of; it will be built into their iPhone or other Apple device, and into their iCloud account.

      Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable. Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.

      We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk, both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.

      Sincerely,

      Access Now (Global)
      Advocacy for Principled Action in Government (United States)
      African Academic Network on Internet Policy (Africa)
      AJIF (Nigeria)
      American Civil Liberties Union (United States)
      Aqualtune Lab (Brasil)
      Asociación por los Derechos Civiles (ADC) (Argentina)
      Association for Progressive Communications (APC) (Global)
      Barracón Digital (Honduras)
      Beyond Saving Lives Foundation (Africa)
      Big Brother Watch (United Kingdom)
      Body & Data (Nepal)
      Canadian Civil Liberties Association
      CAPÍTULO GUATEMALA DE INTERNET SOCIETY (Guatemala)
      Center for Democracy & Technology (United States)
      Centre for Free Expression (Canada)
      CILIP/ Bürgerrechte & Polizei (Germany)
      Código Sur (Centroamerica)
      Community NetHUBs Africa
      Dangerous Speech Project (United States)
      Defending Rights & Dissent (United States)
      Demand Progress Education Fund (United States)
      Derechos Digitales (Latin America)
      Digital Rights Foundation (Pakistan)
      Digital Rights Watch (Australia)
      DNS Africa Online (Africa)
      Electronic Frontier Foundation (United States)
      EngageMedia (Asia-Pacific)
      Eticas Foundation (Spain)
      European Center for Not-for-Profit Law (ECNL) (Europe)
      Fight for the Future (United States)
      Free Speech Coalition Inc. (FSC) (United States)
      Fundación Karisma (Colombia)
      Global Forum for Media Development (GFMD) (Belgium)
      Global Partners Digital (United Kingdom)
      Global Voices (Netherlands)
      Hiperderecho (Peru)
      Instituto Beta: Internet & Democracia – IBIDEM (Brazil)
      Instituto de Referência em Internet e Sociedade – IRIS (Brazil)
      Instituto Liberdade Digital – ILD (Brazil)
      Instituto Nupef (Brazil)
      Internet Governance Project, Georgia Institute of Technology (Global)
      Internet Society Panama Chapter
      Interpeer Project (Germany)
      IP.rec – Law and Technology Research Institute of Recife (Brazil)
      IPANDETEC Central America
      ISOC Bolivia
      ISOC Brazil – Brazilian Chapter of the Internet Society
      ISOC Chapter Dominican Republic
      ISOC Ghana
      ISOC India Hyderabad Chapter
      ISOC Paraguay Chapter
      ISOC Senegal Chapter
      JCA-NET (Japan)
      Kijiji Yeetu (Kenya)
      LGBT Technology Partnership & Institute (United States)
      Liberty (United Kingdom)
      mailbox.org (EU/DE)
      May First Movement Technology (United States)
      National Coalition Against Censorship (United States)
      National Working Positive Coalition (United States)
      New America’s Open Technology Institute (United States)
      OhmTel Ltda (Colombia)
      OpenMedia (Canada/United States)
      Paradigm Initiative (PIN) (Africa)
      PDX Privacy (United States)
      PEN America (Global)
      Privacy International (Global)
      PRIVACY LATAM (Argentina)
      Progressive Technology Project (United States)
      Prostasia Foundation (United States)
      R3D: Red en Defensa de los Derechos Digitales (Mexico)
      Ranking Digital Rights (United States)
      S.T.O.P. – Surveillance Technology Oversight Project (United States)
      Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
      Sero Project (United States)
      Simply Secure (United States)
      Software Freedom Law Center, India
      SWOP Behind Bars (United States)
      Tech for Good Asia (Hong Kong)
      TEDIC (Paraguay)
      Telangana (India)
      The DKT Liberty Project (United States)
      The Sex Workers Project of the Urban Justice Center (United States)
      The Tor Project (Global)
      UBUNTEAM (Africa)
      US Human Rights Network (United States)
      WITNESS (Global)
      Woodhull Freedom Foundation (United States)
      X-Lab (United States)
      Zaina Foundation (Tanzania)

      Source: Center for Democracy & Technology

      1. Clearly, you and all of the organizations that signed this letter are having a knee-jerk reaction without understanding anything about how Apple’s new system is designed to work.

        But to the more salient point: no other major cloud photo library is end-to-end encrypted, and they all already scan for CSAM. Apple’s iCloud Photo Library is not end-to-end encrypted today either, BTW.

        Apple is attempting to change how CSAM is detected so that iCloud photos can be end-to-end encrypted going forward.

        Here’s a serious question: Is it better to store your photos in the open and have them scanned via image matching (the current industry standard), or to have your photos encrypted so that no one else can look at your photo library – with the possible exception being selected photos that have been flagged and re-flagged as CSAM by a privacy-protecting system with baked-in checks and balances?

        It’s really not difficult to understand which system is better unless you don’t have the mental capacity to discern blatant privacy invasion from an actual attempt to preserve privacy.

    2. If you read the entire article, you’d have read this:

      “To many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can look at a person’s private data and report it to law enforcement authorities. Privacy groups and security experts are worried that governments looking for criminals, opponents or other targets could find plenty of ways to use such a system.”

      Maybe you read it and lack the ability to understand rather simple concepts.

      1. How is Apple opening a Pandora’s box by adding encryption to iCloud photo libraries that are currently unencrypted and totally accessible by law enforcement with a warrant? Going forward, law enforcement will not have access to iCloud photo libraries. They will simply be informed if a user’s library contains known CSAM and can use this knowledge to investigate the user in other ways. They can’t ever get access to the photo library!

        1. As I understand it, Apple’s CSAM system has a second automated pass using a ‘finer’ hash scan. If the image is still flagged as non-compliant, a human actually views it to confirm it was not a false positive. For CSAM this may not be too bad, but imagine the hash being used to scan for and flag other types of images, then having a human operator view the image to ‘make sure’ (see the sketch below).
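           A rough sketch of that staged flow, with every name hypothetical and ordinary cryptographic hashes standing in for the perceptual hashes a real system would use:

           ```python
           import hashlib

           # Hypothetical stand-ins: the point is the staged control flow, not
           # the hashing. Apple's second pass reportedly uses a distinct
           # perceptual hash before any human reviewer ever sees a match.
           def coarse_hash(image_bytes: bytes) -> str:
               return hashlib.sha256(image_bytes).hexdigest()

           def finer_hash(image_bytes: bytes) -> str:
               return hashlib.sha512(image_bytes).hexdigest()

           def review_decision(image_bytes: bytes, coarse_db: set, finer_db: set) -> str:
               if coarse_hash(image_bytes) not in coarse_db:
                   return "no match"                   # the overwhelmingly common case
               if finer_hash(image_bytes) not in finer_db:
                   return "dropped as false positive"  # second automated pass disagrees
               return "queued for human review"        # only now does a person look
           ```

           Swap in a different coarse_db and the same pipeline flags whatever the database owner wants flagged, which is exactly the worry.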

    3. Well articulated, PJS 👏🏻 – we now live in a world wherein most people fly off the handle and completely overreact after reading only the headlines (which, as you know, are designed for clicks) and spew their uneducated negativity. You are absolutely correct in saying that almost all TECH companies (Google, Microsoft, Facebook, Instagram, TikTok, Twitter, Snap, etc.) have been scanning these questionable images for years; however, when Apple complies with the request, it becomes an absolute S**T storm. The funny part is that 99.9% of the complaining or concerned individuals using these other tech platforms are completely fine with this already-installed technology and say nothing negative whatsoever. Just like the App Store 30% fee controversy: Apple was hung out to dry, and still is, when in fact Google, Microsoft, Sony, Epic, etc. all charge the EXACT same developer fees, and yet absolutely no one was ranting on and on about them. C’mon, people, you are better than this.

        1. iCloud photo libraries are currently not end-to-end encrypted, so bad governments can already look at anything they want with a court order.

          How is Apple’s attempt to add encryption to iCloud photos while obeying child pornography laws a capitulation?

          1. The BOTTOM line is that the people who truly have an issue with this already-existing technology are those with something to hide, constantly arguing their despicable child-porn agenda (there, I said it – really didn’t want to). Period. Did you understand that last sentence? The technology already exists on several platforms you already frequent. Educate yourself and open your mind before commenting in a rage: all the other TECH companies have been scanning your photographs for years – look it up – including FB, MSFT, IG, WhatsApp, etc., etc. Or are you intending to boycott them all? Here, I’ll answer for you – NOPE. FB is now starting to moderate comments because of the excessive negativity and rude comments – go to their forums and rant about the loss of your freedom of hate speech.

          2. You LIE.

            If you read the NYT article carefully, rather than cherry-picking the Apple positions you agree with, you’d READ the WARNINGS and FACTS about what Apple is actually doing.

            For example: yes, iCloud libraries are currently unencrypted, but the article clearly states, and I quote, that the “company scrapped plans to add more encryption [to iCloud] when the F.B.I. balked.” Just stop promoting this LIE here more than once.

            Take off the rose-colored glasses, Apple apologist. This is a total BETRAYAL of the security guarantees Apple has made to customers for over 10 years, and Pandora’s box is wide open if they go through with it.

            Backdoors are an open door. Spying on private citizens’ phones by Big Corporate is unacceptable, regardless of the holy-named cause. What part do you not understand?…

      1. You are clearly a millennial or gen z, and as with most of your cohort, all you see is whatever you are doing now; you aren’t capable of the critical thought that would give you a glimpse of future implications (incidentally, that is also why you do not make a ‘CEO’s salary’). You aren’t wrong: this is true of Google etc. already, has been for decades. That’s why we don’t use them.

    4. I believe the ‘furor’ is based on the privacy claims Apple uses in its advertising as the chief differentiator between its products and the ‘others.’ Apple practically beats you over the head with how your privacy is no one else’s business, including Apple’s. No other company makes such a bold claim, and until this CSAM fiasco, Apple appeared to hold the line pretty well. On the positive side, this privacy policy means you can be confident that no one you haven’t given permission to has access to your data. On the flip side, it also means that those who use Apple’s tech for nefarious purposes are protected from scrutiny. Any furor aimed at Apple here is well deserved.

  2. MacBreak Weekly has a lengthy discussion, in plain English, of how the technology works, starting at around 4:08. It cleared some things up for me. Might be worth your time to watch.

  3. Personally, I am far more concerned for the VICTIMS of child pornography – and I could give a damn for those who generate it or are obsessed with looking at it. If Apple joins the other companies in scanning images for child pornography, I’ll continue to use the service, because I support actions against child pornography and I think Apple will continue to lead the industry in protecting legal customers and their data.

    1. Yes! This!

      Ken, I am far more concerned with VICTIMS of terrorism also, and I could give a damn for those who participate in it, in any and all forms.

      I support actions against terrorism and Apple will continue to lead the industry in protecting legal customers and their data.

      Sorry, but terrorists have NO expectations of legal protections.

      BTW, Ken, what is your racial identification? We have a huge problem with systemic white supremacy you know. It is one of the major threats to national security at this moment. I am sure you will ally to stamp it out, yes?

  4. I got my GED rather than meet a community service requirement for high school graduation (it allowed me to start college early as well). Being treated like a criminal is something I have never been OK with. Making my phone examine my photos because they want to make sure I am not breaking any laws crosses a line. While I have been an iPhone user since before there were any Android phones, I am now looking at GrapheneOS on a Pixel. To stop the switch and keep a user with hundreds of apps, Apple only needs to drop this stupid plan. They already lost my purchase of a new iMac when they dropped Intel (33 years of PC gaming made that an easy choice).

  5. I think it’s nice that an Apple product is aiding the cause of freedom in unjust countries. Too bad that’s going away.
    Let that be a lesson. When they tell you they care, you are a year or two from being sold down the river.

  6. Somebody’s dream job: Tim Cook is hiring.

    Help Wanted
    We are searching for someone who would like to look at child pornography and people’s private photos and videos all day, every day. Not every pic with someone naked in it is child pornography, so you’ll have to look at lots of private naked photos and watch numerous private sex videos to find the kiddie porn. Do you have what it takes to be a superstar porn viewer? If so, please contact Tim Cook at Apple to join our team today.

  7. No one on this thread has any sympathy for child pornographers; that is clear, should be understood by all, and is not the subject of the opposition. The point is that some of us can see that this is the camel’s nose under the tent, and some cannot. It is that simple.
    Today it is child pornography; tomorrow it is a Confederate flag, a rainbow flag, or the absence of a flag. And no, I don’t use Android, Windows, or any other system at the present time. This is a dangerous step in the wrong direction.
