Center for Democracy & Technology: Apple’s backdoor scanning of Photos and Messages threatens users’ security and privacy

The reception to Apple’s backdoor scanning of Photos and Messages is clearly not the warm welcome for which Apple executives may have naively hoped when they couched it within a laughably contrived Think of the Children™ introduction; the Center for Democracy & Technology has plainly stated that the move “will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.”

Apple on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage services to ensure the upload does not match known images of child sexual abuse, but some privacy advocates said the system could open the door to monitoring of political speech or other content on iPhones. The Electronic Frontier Foundation (EFF) said in a statement that, “Apple is planning to build a backdoor into its data storage system and its messaging system.”

The Center for Democracy & Technology:

[On Thursday], Apple announced that it is planning to make several changes to its messaging and photo services in the United States which the Center for Democracy & Technology (CDT) believes will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.

Apple describes these new policies as an effort to protect children, which is unquestionably an important and worthy goal. Proliferation of child sexual abuse material (CSAM) is an abhorrent crime against which firm action is required. However, CDT is deeply concerned that Apple’s changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

The changes Apple announced today create a backdoor, so that iMessage will no longer provide end-to-end encryption. These changes also create a dangerous precedent for allowing one account to essentially conduct surveillance of another. More specifically, Apple will add a feature to iOS that scans images in iMessages sent to and from users if they are on a family account. On these accounts, Apple will conduct machine learning-based “client-side scanning” in an attempt to detect sexually explicit imagery.

When the system detects a suspected “explicit” image to or from a child user on a family account, it will warn the user that the image is sensitive and notify them that a notice may be sent to the parent if the young person chooses to send or view the image.
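
For concreteness, here is a minimal, purely illustrative Swift sketch of the child-account decision flow described above. The type names, the classifier flag, and the notification option are assumptions made for illustration only; Apple has not published how the on-device machine-learning check is actually implemented.

```swift
import Foundation

// Illustrative sketch only. `isFlaggedExplicit` stands in for the on-device
// ML classifier's verdict, which Apple has not documented publicly.
enum IncomingImageAction {
    case deliverNormally
    case warnFirst(parentNoticePossible: Bool)
}

struct FamilyAccountMessagePolicy {
    let isChildAccount: Bool
    let parentalNotificationsEnabled: Bool

    func action(isFlaggedExplicit: Bool) -> IncomingImageAction {
        // Scanning applies only to child users on a family account.
        guard isChildAccount, isFlaggedExplicit else { return .deliverNormally }
        // The child is warned first; a notice to the parent can only follow
        // if the young person chooses to view or send the image anyway.
        return .warnFirst(parentNoticePossible: parentalNotificationsEnabled)
    }
}
```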

The company also announced changes to its photo storing policies. Apple will store a database of hashes (small strings of data that serve as a fingerprint for an image) of child sexual abuse material (CSAM) on users’ phones. For users that have enabled iCloud photo storage, the operating system will check a user’s photos against the database before uploading them to iCloud. If a given account reaches a pre-set threshold of images that match the database, and a human reviewer agrees that the images depict CSAM, Apple will submit a report to the National Center for Missing and Exploited Children (NCMEC). Apple will also suspend the account, subject to appeal by the account owner.
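
As a rough illustration of the threshold logic described above, here is a minimal Swift sketch. It substitutes an exact SHA-256 digest and an in-memory set for Apple’s perceptual “NeuralHash” and its blinded on-device database, and the threshold value is a placeholder, since Apple has described only “a pre-set threshold.”

```swift
import Foundation
import CryptoKit

// Placeholder stand-ins: the real system uses a perceptual hash ("NeuralHash")
// and a blinded hash database shipped with the OS, not SHA-256 over raw bytes.
let knownCSAMHashes: Set<String> = []   // would be populated from the on-device database
let reportingThreshold = 10             // placeholder; Apple disclosed only "a pre-set threshold"

// Compute a fingerprint for a photo queued for iCloud upload.
func fingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count how many queued photos match the known-hash database.
func matchCount(for pendingUploads: [Data]) -> Int {
    pendingUploads.filter { knownCSAMHashes.contains(fingerprint($0)) }.count
}

// Only once an account crosses the threshold would the flagged images go to
// human review, and only after that review would a report be filed with NCMEC.
func exceedsReportingThreshold(_ pendingUploads: [Data]) -> Bool {
    matchCount(for: pendingUploads) >= reportingThreshold
}
```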

These new practices mean that Apple will no longer be offering fully end-to-end encrypted messaging through iMessage and will be undermining the privacy previously offered for the storage of iPhone users’ photos. The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor — it is a backdoor. Client-side scanning on one “end” of the communication breaks the security of the transmission, and informing a third-party (the parent) about the content of the communication undermines its privacy. Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.

The changes to iMessage’s privacy for users under 18 are particularly concerning because there is no guarantee that the parent-child account structure Apple has devised will be used as intended by actual parents and young people. The same tool that Apple intends to be used to fight predators seeking to “groom” potential victims could expose sensitive information about young people’s sexual identities to unsympathetic adults. And machine-learning classifiers for detecting nudity and sexually explicit content are notoriously error-prone; it’s almost certain that Apple’s new tool will mistakenly flag health information, memes, art, and advocacy messages as “explicit” and send alarming notifications to parents that imply their child is sexting.

Nojeim says, “The changes Apple announced are extremely disappointing, given the leadership and commitment to user privacy and communications security it has long demonstrated. Apple’s retreat from providing secure end-to-end encrypted services opens the door to privacy threats for all users, while creating new threats for young people. In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences.”

“Images attached to messages that were previously protected by end-to-end encryption will now be searched routinely using algorithms that have not been revealed to the public. And users who expect privacy in the photos they take and share with their iPhones can no longer have that expectation when those photos are backed up to iCloud. Instead, they should know that Apple will scan those photos,” Nojeim adds.

MacDailyNews Take: Apple’s move is so perplexing, instantly destroying years of privacy-protection claims from the CEO on down, that we wonder whether Apple was promised something, implicitly or explicitly, in return for this faith-shattering and, yes, stupid decision.

There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever.

We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust. — Apple CEO Tim Cook, October 1, 2015

Pfft.

In exchange for selling out their billion-plus iPhone user base, would the U.S. and other governments go easier on Apple in the realm of antitrust or taxes or something?

Apple’s decision is such a disappointing, hypocritical, and obvious 180° capitulation that there has to be more to this story.

Why would Apple — which was, for years, so vehemently pro-user-privacy and so adamantly opposed to adding backdoors into their devices — make such a shocking, self-defeating, and embarrassing about-face, painfully and desperately introduced under the guise of Think of the Children™ in a failed attempt to soften the blow?

See also: How to jailbreak your Apple iPhone

34 Comments

  1. The TL;DR version: If you use iCloud Photos, they will do a pattern-matching scan looking for large numbers of images that have the same cryptographic “fingerprint” as known pornographic images of children. If a threshold of potential matches is exceeded, that means there could be a problem where investigation by law enforcement might be warranted. If you are a parent and set up a Family iCloud account, you can choose to enable an option to alert you if your kids might be sexting. That’s really it. Apple doesn’t look at your images in either case. Scanning/analysis is done locally on device, if these features are used. You don’t have to use these features.

    1. Your attempts to minimize and your lies of omission are not going to work, Tim, uh, SM.

      1. In this age of OVERREACTING TO EVERYTHING, ALL THE F&*#JKG TIME, it’s hard to get anyone to calm down long enough to have a rational discussion. For the haters who will only ASSume the worst about Apple, this probably isn’t worth your time, but for those actually interested in the other side of the story:

        https://www.apple.com/child-safety/

        1. Right, children buy iPhones. This is a total capitulation of Apple’s terms of service and a TOTAL BETRAYAL of the security Cook preached for years.

          Remember Cook righteously defending not opening up iPhones to catch terrorists during the Trump administration? What changed?

          The baloney “save the children” ruse is nonsense. Big Brother Apple now lives in your phone, open to law enforcement and government agencies gaining easy access to private phones, the Constitutional right to privacy be DAMNED!

          Welcome to 1984: Apple’s Big Brother is watching you…

    2. FASCISM MASKED WITH GOOD INTENTIONS. Presently, law enforcement can access your iPhone anytime they want with a warrant, based on evidence; we call that the 4th Amendment. What Apple is proposing circumvents the rule of law and puts Apple and its “partners” in the position of deciding for you what’s right and wrong.

      1. You nailed it. I don’t want anybody messing with my stuff and don’t want any tech company surveilling me or my family. This is so wrong, Apple. Privacy is more important than Apple. I am willing to ditch all of my Apple devices to stop this madness.

    3. SM, as you can see from Ed’s comment, facts are irrelevant to the Apple critics. What the company says is less credible to them than the word of a guy who turned 1.5 million classified US documents over to the Russian intelligence services. The lives of US agents and the victims of child abuse are irrelevant to their view of this debate.

      1. TxUssr,

        On a mission to prove himself to be the most gullible fucktard ever to walk the face of the earth.

      2. Sometimes it’s the rebel and lawbreaker that’s needed to clean the deck of the dreck.

        It’s called a whistle blower. You like them when they are blowing the guy you don’t like (Vinman).

        Perhaps you kind of like when the govt is listening to things deemed private?

        1. I love whistle blowers. They are people with a credible fear that their employer is breaking the law and will ignore their complaints who go to an agency granted power by law to investigate the employer and bring them to justice.

          Sort of like a National Security Council employee who reports wrongdoing to the Council’s lawyer and to the Inspector General of the Defense Department, and then testifies under subpoena before a body charged by law (and the Constitution) to investigate precisely such matters.

          It does not include a contract worker for the National Security Agency who makes no official complaint to anybody, but instead steals 1.5 million classified government documents concerning national security matters and goes to the press with them, and when that goes sour takes them to Russia with him.

          Snowden had precisely the same opportunity as Vinman to act as a legal whistleblower, but chose instead to act in a manner that did untold damage to American national security.

          He has every opportunity to vindicate himself at trial, if he chooses to do so. Instead, he is hiding from extradition in Russia.

        2. Here is my deflection, and at least I’m not a little man like you, and I fully admit as much!

          The latest Hunter Biden sex and illegal drug encounter was recorded. What, he’s not Trump’s son, so no wonder CNN is not hyperventilating 24/7 calling for justice; that simply does not apply to despicable Demonrats.

          Apply justice equally and prosecute the low life alleged criminal…

    4. I’ve believed Apple when they told me that my data is mine, and that they’re helping me protect my assets. This policy seems to say that Apple “SUDDENLY” can read my data. Throughout my career, I accepted the assumption that it was actually Apple’s internal ethics protecting me, and with that concept embedded in the licenses and the company’s statements, that was OK for me.

      Suddenly to state this is a denial of their marketing over the last 2 decades … I’m pissed. I was OK just knowing that, without a legitimate demand via the courts, Apple would never participate in such an exercise; now suddenly my data, with no child pornography and no child labour laws violated, could still be exposed to some technician who would have to check the scan results to verify that there were no algorithmic errors in the scan. Even if it’s Tim Cook himself, I haven’t given him permission to view any of my data.

      The technician’s review of scanned data is clearly required; otherwise Apple could report false positives to the authorities, despite the fact that a false result can also destroy lives.

      I consider this Apple announcement to be a disaster, and a revocation of every promise and warrant of ownership between Apple and Apple software and hardware users.

      Given the number of us who save documents in picture format for security purposes, the new policy will expose Apple to breaches of many contractual terms and potentially to major legal problems over our loss of privacy.

      I would never allow any legal authority access to my data without the due process applicable to a citizen of my country (Canada), which includes a court order. This becomes a much bigger problem for companies that have provided free or inexpensive cloud storage, such as Apple.

      At a bare minimum, I will ask Apple to provide a simple utility for its users that will easily collect 100% of a user’s data on Apple’s cloud servers, readily transfer all of it to a user’s own local cloud server, and then allow total erasure of anything still within Apple’s access.

      1. Just playing devil’s advocate: if Apple only identified the filename of the suspect file and allowed the user to submit said file themselves for verification that it is NOT an instance of child pornography, then it may be a bit more palatable than having a third party go in and look themselves. That way they don’t get to ‘view’ anything you didn’t allow them to. Apple could run their ‘check’ on the file to verify that the file you submitted was the one they flagged and not switched out with a different file to save face.

        1. Where in the Constitution does it say that a private company cannot screen the material on its own servers pursuant to an agreement with its customers? In any case, the Fourth Amendment (like the First) only applies to governments, not private parties.

        2. I rely on Apple’s various statements to the effect that my information is held private for me, and that Apple has no interest (nor ethical reason) to ever view my data. I referenced warrants in that I expect that Apple won’t even release my metadata to govt organizations without having received appropriate warrants from appropriate agencies following correct procedures.

  2. …and Rome fell

    Many wondered what it would take to bring Apple down; like many great companies and countries, it’s rot from within.

    At this point, even if Apple did a 180, their stellar reputation on privacy is now trash because we know they’ve entertained the idea in the first place.

  3. If this does not change, the shift from a free society to what is basically a surveillance society is all but inevitable. It is still hard to believe that, of all companies, it is Apple that has now caved most monumentally.

    I was born in 1962, and grew up (and still live) in the northern part of the Netherlands. From here, the border to East Germany, the DDR (GDR, in the English acronym), was only about 250 miles away. The DDR was the most notorious at suppressing its citizens into a horrendous surveillance state, orchestrated by the Stasi secret service, besting even the Soviets themselves at totalitarianism and effective suppression.
    I am quite sure China took more lessons from the DDR of that period than from the USSR.

    The people in that surveillance state lived in 24/7 fear. Fear of being caught for wrong thinking, for wrong writing, for wrong speaking. Fear especially for one’s own children, as through the school system the authorities made sure that children would betray their own parents and family, with great success. Fear of gathering, fear of meeting the wrong corrupt people. A horrible economy, constant risk of food shortages.

    People desperately wanted to flee, but couldn’t because of the Iron Curtain, designed to keep people inside. I remember vividly the constant news reports about yet more civilians being shot in the back by the border guards for trying to jump the wall and seek freedom. Sometimes the bodies would hang in the wires for hours, or even one or two days, before they were collected. This continued for decades. One got used to these news reports.

    The stuff that came, and still comes, to light after the fall of the Iron Curtain surpassed anyone’s deepest suspicions about what a surveillance state does to its citizens. Betrayed by your neighbors, friends, children. And then there were the apparatchiks who delighted in this and made their careers out of all that. All beyond horrible.

    All people should be alarmed at slow but potentially serious steps in the direction of surveillance in all its forms, as they will inevitably lead to a surveillance state. Perhaps in a new form, largely a privatized one, but that does not matter. Freedom and principles of justice must be absolute.

    Ultimately all authoritarian surveillance structures are the same, and must be avoided in their earliest stages at all cost. AT ALL COST.

    1. Panda, your comments are very important.

      Unfortunately, Americans are victims of their own success. They have not suffered any deprivation in over 80 years, 4 generations. They cannot even imagine what it would be like to live in a surveillance state where each behavior and thought is controlled, and “wrong” ones are punished.

      And to those who think they are being clever by using “choice architecture”, I would say that in fact you’re being too clever by half, to put it charitably. An architected choice is no choice at all, and people can see clearly through that façade, unfortunately only after they realize they can’t do what they really want.

      And then there are those evil ones who sing a lullaby while holding a knife behind their backs, the lyrics to which go something like, “Hey, you’re being hysterical! Nobody’s talking about xxx!”, and then xxx happens and then they say something to the effect of “Well, it’s already a fact of life! What do you think you signed up for?”. The camel puts its nose under the tent and says, “Hey, stop complaining! It’s only a nose! I’m not saying I want to get in the tent!” Then the camel is in the tent and he says, “Hey! I’m here, and you let me in in the first place! You have no right to say anything! STFU!”

      A society living under constant surveillance and false choices is a demoralized, miserable and unproductive society. The Elites who watch the cameras, architect the choices, and inflict pain for “incorrect behavior” preside over a living hell, but that is precisely their motivation for doing so.

      So, call me a child molester, a knuckle dragger, a moron, or anything else you think will distract people from the truth. Play shell games with facts and logic, distracting people from them as well. Those of you doing so will be spending the rest of your miserable lives playing just that game, always looking over your shoulder for all the “comrades” who are trying to kill you too as you rule over poverty and misery.

      But hey, it’s all “for the children”, right?

    2. I fully agree. Things are deteriorating so rapidly. Tech censorship is all over the place. Tech thinks they have the right to decide what we see, hear, and think. Wrong. Apple is the same as Facebook and Google and the like. Evil.

        1. “perfect Democrat police state lackey”

          Yes indeed, TxUseless posts the evidence sometimes several times a day.

          But critical thinkers immediately spot partisan fool topics; it only takes a few key words.

          I would simply add, as we have read lately on this topic, TxUseless is also the Numero Uno Apple apologist “LACKEY.”…

      1. It wasn’t much fun seeing what Google, FB and Twitter did in the past year… atop compromised media and the absence of a truly functioning legal system. It wasn’t fun, and it was far from a cute little tempest in a teacup.

        Take that history away and the recent Apple news would likely not have the same impact. Recent historical occurrences have conditioned reactions.

    1. Whatever, you are naive, and you’re believing what you’re told. The concern isn’t simply this limited use they intend to implement; it’s the use of this in the future for things completely different from why (they say) Apple is doing it now.

    2. The concerns are not about the facts of the current iteration (I know them too) but about where it can potentially lead now that the gauntlet of surveillance tech on one’s own device has been thrown. And where it most certainly will already lead in the shorter term in current authoritarian states.

      Moving goalposts got us here (end-to-end iCloud encryption is certainly in the trashcan now, and the no-backdoor promise is also loosening), so who knows what moving goalposts and mission creep we will see in the coming years? That is what the concerns are about once the tech is in place.

      Of course the current iteration would look as innocent and noble as possible; that is what the “think of the children” strategy is meant and designed for. Works every time.

  4. Protecting terrorists and murderers by keeping the FBI out, but porn is more important, justifying full intrusion mode? This might be great non-connected technology for parents to have to protect their children, but Apple deciding to report “violations” to the government? Apple and other Big Tech companies have already proven to effectively be an arm of the government and of the far left. Once (if) this is accepted, it opens the door for the coming of full surveillance. It’s not simply the end of Apple’s privacy; it’s yet another ally/resource of a governmental agenda that is already leading America to Marxism and communism, which is already operating successfully in places in the country. People have long since stopped learning from history, because they have long since stopped learning and understanding it in schools at all levels. In totalitarian countries, it was often the government itself that took away rights and imposed itself on its people. Here, we are doing it to ourselves through powerful private companies and the people themselves. The loss of Apple’s protections is serious since there is potentially no equivalent replacement for it. We can only hope they will quickly reverse course and back down. But will we be able to trust that they wouldn’t then clandestinely monitor us and report to the government in the future, since the technology is already set to go?

  5. Besides the absolute invasiveness, what do they hope to accomplish with this? It will solve nothing.
    The ones who have these tendencies and know that they are being watched will simply circumvent it by using other platforms and avoid Apple iCloud, or Apple altogether.
    This is a horrific PR/publicity/business move by Apple… it feels Orwellian! It IS ORWELLIAN!!!

    1. Can you suggest any platforms that aren’t already compromised? This just seems like Apple is finally the last big player to join the total surveillance club. It would be a tough pill to swallow, but it may be time for a Linux phone; at the very least, don’t update to iOS 15 and get off of iCloud. It has started with photos and it will expand to everything you have, if it hasn’t already.

      1. Which other platform has blatantly announced that its OS will have built-in surveillance spyware?

        Here is a company that has built its reputation on the promise of ‘privacy at all cost’ which is now, in your face, telling you that its software will come with a built-in mechanism for surveilling the content of your phone!!!🤯.. Yes, photos now… but what’s next?
        2nd: turn off iCloud… that breaks the ecosystem, another huge marketing point of Apple’s!
        3rd: no iCloud… no need for extra storage anymore… hurts their services revenue… a shot in the shareholders’ foot!

        I would not be surprised if there is a class action lawsuit from shareholders, etc., as this is in no way in their interest!… and it is against all Apple stood for… principles we believed in, invested in, and chose to pay premiums for… all gone!

        I for one won’t upgrade to 15….
        This is total betrayal and hypocrisy on Apple’s side!

        Hope you are listening, TIM ORWELL COOK!!!!

  6. Don’t worry, guys, this is just Apple bending the knee to the CCP. We have nothing to fear. And we know we can trust the CCP. The backdoor is per CCP request. It’s got nothing to do with moral responsibilities; we all know that’s just for show. We know Apple is either weak or a moral fraud.

    https://www.theinformation.com/articles/seven-apple-suppliers-accused-of-using-forced-labor-from-xinjiang

    Your data can be trusted.

    @1.56, companies were instructed to spy on individuals. Control of the App Store isn’t enough for the CCP. They must have access to the phone. Apple’s new backdoor is for the CCP. The CCP can shut down the majority of Apple’s manufacturing in a few hours. Apple is the CCP’s bitch.

  7. Your point is somewhat valid, but please, Apple’s platforms are not just for those of you in the USA. Many of us (I include myself) don’t actually consider the USA any less likely to violate our persons and possessions than the CCP, Russia, North Korea, or even Ukraine.

  8. Apple totally messed up. I want to own the data on my phone and don’t want any tech company scanning it and applying their rules of what is wrong or right. If Apple thinks it can go this way, I am out. Privacy is more important than Apple.
