Privacy, Schmivacy. Apple installs backdoors into iPhones

Apple has announced its plans to bring changes to its operating systems that sound to many security researchers like a massive privacy nightmare and the perfect invitation to unintended consequences.

Rafia Shaikh for WCCFTech:

Raising concerns in the industry, the company argues it is doing so to protect children and limit the spread of Child Sexual Abuse Material (CSAM).

The two main points of concern are:

• Apple plans to add a scanning feature that will scan all photos as they are uploaded to iCloud Photos to see if they match a photo in the database of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC). [A sketch of this hash-matching idea follows the list.]

• It will also scan all iMessage images sent or received by child accounts (accounts designated as owned by a minor) for sexually explicit material. Apple will warn the child if they try to send or receive sexually explicit photos and will notify the parent.
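
A minimal sketch of the hash-matching idea from the first bullet. Assumptions worth flagging: Apple describes a perceptual hash (“NeuralHash”) that tolerates resizing and recompression, whereas the sketch below uses SHA-256 from CryptoKit purely so it runs as-is, and the database entries are placeholders, not real data:

```swift
import Foundation
import CryptoKit

// Placeholder stand-ins for the NCMEC-supplied hash list; the real list
// ships as opaque perceptual hashes, not hex strings like these.
let knownImageHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

/// Returns true if the image data's digest appears in the known database.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}

// Usage: screen an image before it would be uploaded.
let photo = Data([0x01, 0x02, 0x03]) // stand-in for real image bytes
print(matchesKnownDatabase(photo))   // false for these placeholders
```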

However, security researchers, while supportive of efforts to combat CSAM, are concerned that Apple is effectively enabling governments worldwide to gain access to user data, which could go beyond what Apple is currently planning, as is the case with all backdoors. While the system is purported to detect child sex abuse, it could be adapted to scan for other text and imagery without the user’s knowledge.

Security researchers around the globe have been writing about why this is effectively the end of privacy at Apple, since every Apple user is now treated as a suspect unless proven otherwise.

The new features are concerning even without government meddling and could prove life-threatening for queer kids.

Emma McGowan for Avast:

While combating CSAM is extremely important, privacy and security experts are concerned about the possible unintended consequences of this technology. [Avast Chief Privacy Officer Shane] McNamee questions whether companies should scan people’s devices at all.

“Now that this is possible to have access, authorities will push for more access,” he says. “It’s like we’re peeking over your shoulder, but we’re wearing sunglasses and saying the sunglasses can only see bad things. And you have this little snooper on the device that’s just reading everything and checking it, not sending it to Apple unless you’re doing something wrong. That’s the problem — the definition of ‘doing something wrong’ could be broadened.”

Brianna Wu — a computer programmer, video game creator, online advocate, and Executive Director of Rebellion PAC who describes herself as “an Apple fan” — points out that the US government could theoretically create legislation giving it permission to use this technology without the general public ever knowing. There are “far less checks and balances” on behind-the-scenes deals between the US government and tech companies, made in the name of national security, than the general public may believe.

“This would allow agencies to spy on our phones to find, say, pictures that the Pentagon says compromise national security or belong to terrorists,” Wu tells Avast. “And if you look at the specifics of Edward Snowden’s revelations, it’s clear that our national security agencies may stick to certain rules in the US, but outside there are no rules at all. I feel very confident this technology could be used to spy on people in other countries.”

The second big change is that Apple will allow parents to implement a program on their children’s iMessages that would blur any images with nudity. It will also alert parents if the child chooses to view the image or send nude images themselves. While Wu says she “can live with the iCloud part” of these new changes, she feels that the scanning messages part leads down “a deeply Orwellian road” and she “would beg Apple to reconsider.”

Wu points to the fact that the majority of US teens are sexually active before the age of 18, and that “sexting” is not uncommon among teenagers. This technology, then, potentially infringes on teens’ right to sexual autonomy. It could also open the children up to charges of distributing child pornography if a parent reports it, or the parents themselves if they share the image with the other parents involved.

But even more concerning to Wu is the possibility that this technology could “out” LGBTQIA+ kids to their parents, potentially placing them in both psychological and physical danger.

“Name a product from Apple, I’ve got it,” Wu says. “I’m all-in on the Apple ecosystem because of privacy. I root for them to succeed, but this is by far the worst plan I’ve seen them put into effect. I think they’re going down a wrong path and it’s extremely concerning.”

MacDailyNews Take:

Interns: Make it a double.

Comments

    1. I am so happy. Been an Apple fanboy for decades. Now I get to try other insecure platforms and save money. Sold my Apple stock, and my fiancée is selling her 40K. Since there’s no privacy: Google, Samsung, MS, here I come. There is nothing Apple can say or do to remedy this.

      Do you remember the HomePod fiasco, with people listening to your home conversations?
      Forget Apple equipment from today on; it will be an MS Surface Book, a Samsung 49″ monitor and Fold phone, and a Garmin watch (I had three Apple Watch Series 6 watches).

      True, I might die without the watch calling 911, but I deleted my Apple Health data; can’t trust that either.
      I see a storm brewing: Apple employees rebelling so they can work from home.

      Oh, Adobe Suite, here I come. Bye, Apple!

  1. This whole dispute is mass hysteria. Your device has been scanning your photos for years. Parental controls have been available for years. Apple has had the ability—and the legal right via your consent to their user agreement—to scan your content on iCloud for years. There is nothing fundamentally new here. If you don’t want to know if your children are exchanging pornography, don’t activate that feature. If you don’t want Apple to access your data on iCloud, don’t store your data on iCloud. Problem solved.

    1. For those who have actually read the EULAs, this is not news. But like the current Critical Race Theory uproar (some teachers have been teaching some of this topic for decades), this ‘hidden in plain sight’ Apple policy is now big news because it is being brought into the spotlight.

      The ‘new’ thing here appears to be not really iCloud but the scanning of images that go through iMessage. So while not using iCloud may be an option that removes one aspect of concern, no iOS user I know is willing to also stop using iMessage.

      1. The only iMessages being scanned are those on the devices of minors whose parents have activated the relevant parental controls. Obviously, the potential exists for scanning other messages, but that ability has existed for years. Nothing in this announcement affects that in the slightest. I would have thought that conservatives, who generally support hands-on parenting, would welcome this additional tool to help protect their children.

        1. “Your device has been scanning your photos for years.”
          And it should have been emblazoned on the app screen, not hidden in a EULA, a unilateral, unnegotiated contract.

          Then as far as parental controls go… that’s determined by the owner of the device.

          I’ve been screaming for years about how Apple is the forced IT department.
          It’s also exactly why iOS devices can never, ever, be PCs.

          1. If somebody was too stupid to figure out that the “People” album in Photos required scanning all the images to look for familiar faces, no big notice was going to help them.

            Likewise for those who could not figure out that the CDA required Apple to take every reasonable precaution to avoid possessing child pornography on its servers.

            Likewise for those who expect Apple to allow parents to protect their children with optional parental controls, but could not figure out that blocking objectionable content requires looking for it.
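
            To this commenter’s point, on-device image scanning is a public, documented capability rather than a hidden one. A minimal sketch of face detection using Apple’s Vision framework (the file path is illustrative):

            ```swift
            import Foundation
            import Vision

            // Face detection of the kind the "People" album relies on runs
            // entirely on the device via Apple's public Vision framework.
            let url = URL(fileURLWithPath: "/path/to/photo.jpg") // illustrative path
            let request = VNDetectFaceRectanglesRequest()
            let handler = VNImageRequestHandler(url: url, options: [:])

            do {
                try handler.perform([request])
                let faces = request.results ?? []
                print("Detected \(faces.count) face(s), no server involved.")
            } catch {
                print("Vision request failed: \(error)")
            }
            ```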

            1. Let’s leave any objections to the law out of this for now.

              Apple advertises privacy and security. According to the law, as you point out, they are not allowed to “sell” that kind of privacy. If they mean “privacy within the law,” then it should be emblazoned that you do not have the privacy and security you think you have.

              Also, it’s not yet clear to me whether any of this software is installed “on device,” which would then NOT be on their servers. Did they get my permission to bug my device? That should be emblazoned too!

        2. While the article does point out that the scanning of iMessage images is limited by a parental setting to child accounts at present, this feature seems insufficient to catch the actual adults sending CSAM images to other adults, and it may in short order no longer be limited to child accounts.

        3. Please answer simply….

          Apple’s recent announcement:

          • is a step encroaching on an individual’s privacy/security
          • is a step with no material difference in privacy/security
          • is a step fortifying a user’s security/privacy

          Hold the verbosity and conflation. You can resume your tendency after selecting an option.

          1. Fing, it isn’t that simple, but if I had to choose, I would say that it fortifies user privacy/security because more client-side screening on your device means less server-side screening by Apple.

      2. The “new” thing includes Apple notifying the government, the police, based on its changing definitions of what Apple deems allowable on your phone and in your account.

        1. Mr. Wells:

          Again, not a new thing. Since 1996, Apple has been complying with the Communications Decency Act by taking all reasonable steps to discover any child pornography on its own servers. As the law requires, they report it to law enforcement when they find it.

          Apple isn’t setting the standards or “deeming what might be allowable.” It is comparing images being uploaded to iCloud against a database of hashes that match known child sexual abuse images compiled by an independent agency. These are not “baby in the bath pictures.” They portray hardcore sexual acts perpetrated on children. The commercial traffic in such images finances human trafficking of those children. To the best of my knowledge as a former prosecutor, there are no “changing definitions” of what conduct constitutes child sexual abuse.

          The CDA really only offers Apple two choices: it can vigorously defend itself from having child abuse materials uploaded to its servers, or it can shut down iCloud.

          1. Tx, thanks for your comment.
            I believe you’re incorrect, though. The law has not required a company to develop software and report to this degree, or we’d be seeing this from all the other companies. In fact, if the law required this, it would almost assuredly set a lower reporting standard than the one Apple says it is setting. The opportunities for modification and abuse are the main concern here, not this one on-your-phone filter. It’s the potential other uses that concern people.

    2. It’s not limited to iCloud. It scans on device. That’s the difference. It’s major. And it is a back door. Apple totally messed up big time.

      And this means I no longer trust Apple by default and will now start actively looking for options to leave. They sold me privacy, but all I got was a back door. The “think of the children” trope constantly gets trotted out to take our freedoms. Cannot believe Apple is betraying privacy after so much work and so many words about how important it is.

    1. Did you read the article? The entire point is that it could very easily become a back door, in a manner that none of us would even know about, and that it is pretty much a complete 180 from what Tim Cook assured us of, rather vocally. FWIW, people said precisely the same thing about Google and Facebook more than a decade ago (‘What’s the big deal?’), and we all know how that turned out.

      1. Any feature of any communications device, from the Morse telegraph on, could become a back door. It is irrational to suddenly go into a blind panic about a bare possibility that anyone who understood the technology could have imagined decades ago. We should take steps to keep the possibility from becoming a reality, certainly, but let’s not punish Apple for some possible future misapplication of technology that it and its competitors have had for years and that Apple (if only Apple) has sought to use responsibly. This is not Minority Report.

        1. The difference is that a tech company, one that has supposedly championed security and privacy, is now entering a realm of private ownership to make a determination. Apple may own the servers, so their contents are scanned within the law. Now the line between ownership and privacy is blurred, in the name of upholding the law and protecting children, on a device, in a realm, not owned by the “inspector.”

          How about using the same decision process to permit digi-assistants to listen for certain words, volume levels, and other voice characteristics that relate to anger or are otherwise determined to indicate assault/violence? Shall we permit that for safety and protection within a household?

          You note a ridiculous truism: of course a comm device “could” become a back door. Could, should, would… meaningless verbiage. And who cares that Apple’s competitors have possessed the tech for years? MS had spreadsheets long before Apple. Find a better logical explanation. Apple has been a security/privacy vanguard and has distinguished itself, greatly so in certain efforts. This is not simply a “blind panic.” This certainly smells like, and can easily be interpreted as, another tech company Big Brotherising what is personal and belongs to simple ownership, free of intrusion.

          What Bureau do you work for?

          1. There is not a single thing in this new program that Apple could not have done five years ago. It did not abuse its users then, so where is the evidence that it is going to abuse them now? Yes, the technology could be abused, but it could have been abused all along. There is literally nothing new to see here, certainly nothing to inspire all this uproar.

            1. What a f’g naive answer… or at least one following the proper govt script. Our country has been steadily moving toward control and toward determining the correct way to think, speak, and act. I’m not talking about being lawful. Your mindset epitomizes the frog in the soon-to-be-boiling pot of water.

              “Scoring” here, like in China, IS NOT a stretch… we are on that road. Remember the students who were asked, by VT’s governor, to make note of parents who arranged group visits during Thanksgiving? I’d call that “policing” of the abhorrent, nightmarish kind.

              How about the current admin’s “National Strategy for Countering Domestic Terrorism” program? Its purpose is to bring notice to those who might be “radicalizing.” Sounds great, except these days “radical” could mean espousing “freedom.” Literally. The American flag is radical because it is seen by many to be racist.

              It could also mean objecting to the mindset that says being racist is OK as a giver of the proper medicine; call it karmic return, or justice. Have fun objecting to this illogical and irrational “justice” when it’s being propelled in many sectors of our govt. Objectors are sure to be “curated” by the media groups, and it would be no surprise if the more significant blight were experienced by those who deny it has anything to do with freedom, progress, or justice. Presently, MLK’s thinking is so last decade. Pure Kendi garbage.

              Apple is not (yet) a Twitter, a Google, or a FB in respect to being an active “social scorer,” but it’s quite reasonable for one’s spidey sense to react with vehemence against this latest Apple news. You equivocate with “it’s been done for years, everyone is already doing this or worse, Apple could have done this years ago.”

              Apple made a move from monitoring their own property (servers) to eyeing someone’s personal device. They regressed with regard to privacy. How ANYONE can say that’s NOT concerning is beyond me. I react with logic, objectivity, and a dose of sorrow, not in blind panic.

              What Bureau do you work with?

            2. I know what you are saying, but I am hearing the echo of “This new technology has been developed by big business and the government to punish honest patriots for being hard-working citizens. So let’s throw our wooden shoes into these infernal looms.”

            3. What do you mean, “I know what you are saying”? Knowing your ways and means, it’s pacifying BS. Making that statement about big tech and supposed overreach, adding a dose of condescension with the “wooden shoes” comment, and implying the slippery-slope worry is blind, ignorant, or hyperbolic are all signs that healthy skepticism is absent. We ARE moving toward scoring as I write.

              I say this knowing you had NO issues with what TWTR, GOOG, and FB did in recent times. These Big Tech media businesses, working with the govt, can do IT to those NOT in line with the “proper views” (who said anything about hard-working patriots?), and there’s little reason to expect it to stop. In fact, statements coming from the Executive prove this to be the case, the already-mentioned “National Strategy for Countering Domestic Terrorism” program among them. I like the flag… I’m likely on their list.

              Being patriotic is on the verge of being lumped in with “radical,” btw. “Working” can also be associated with times past: naive, independent, and likely capitalistic people, sure to be considered broadly radical in future times, and considered radical by some in the US even now.

              Cut the condescension and stall the conflation.

              I’ll ask again, what Bureau are you linked with?

            4. Not linked with any Bureau. Just somebody who knows enough history to recognize that new technologies always inspire opposition from groups like the Frenchmen and Flemings who threw their shoes into the looms. The saboteurs lost, as did their Luddite allies in 19th century Britain. Ideas are like genies—you cannot get them back into the bottle once they are out. Efforts to stop change are rather like efforts to stop the tide, in that they end with being overwhelmed.

              I am all for managing the technology for image processing that is at issue here. That will be very important for protecting our rights going forward.

              However, any effort to ban a widely understood technology that has been around for years cannot possibly succeed. Keeping responsible actors like Apple from guiding the development of this technology will only put the worst possible parties in command. Given the choice between Apple and the Chinese, Russian or even American Government as the developer going forward, I know where my vote goes.

    2. It may be too much to expect that they understand that something that does not allow anyone but you (and your parents, if you are a minor) access to your device is not only not a backdoor, but not a door at all. Your iCloud account is not on your device, and that is the only thing Apple can access… if you give them access by agreeing to their end user agreement. If you don’t store any data on iCloud, it isn’t subject to access unless you give your consent in some other way.

      1. TxCCPUser is in full-press damage control. We all know where he stands. And it’s amazing how TxLoser knows what Apple’s internal intent is. It’s not just iCloud, it’s also messaging, and Apple never said that’s where they will stop. It’s telling how you blindly trust and defend Apple. Defend the indefensible. Working with a criminal state that systematically murders people for profit; organ harvesting, slave labor, and systematic genocide are all acceptable if they can hide it with subcontractors. Besides, it’s just a cultural difference, and if it means getting iPhones produced at a tidy 38% profit margin, then hey, what is a CEO to do? We can’t expect them to stand on moral ground when their P/E is at stake. It’s not like they have a lot of wiggle room when dealing with a criminal state. Do you know how fast Apple could burn through that $2.4 trillion market value?

        If the CCP wants a back door then guess what?

          1. “We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

            “Apple is planning to build a backdoor into its data storage system and its messaging system.”

            You are on the other end of the EFF’s position in re: “backdoor” and preserving an INDIVIDUAL’s privacy. I know you’re tempted to equate this with advocating child abuse. Control yourself; spouting such doesn’t serve you well.

            https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life

            1. Perhaps if you keep calling something that provides no access to data on your device to anyone who does not have your express permission a “backdoor” often enough, it will eventually become true.

              Now, perhaps if I call myself a billionaire often enough…

            2. If you have an iPhone, your device has had a client-side scanning system for years. Scanning for sexually explicit images is exactly the same technology as scanning for pictures of your Aunt Effie. Apple does not need to “make a well-intentioned effort to build such a system.” They already have the system and rolled it out years ago. I assume that similar systems for scanning and searching images are installed on virtually every device from virtually every other vendor on the planet.

              There is no break in encryption because the scanning is happening on your secure device. Are you under the impression that when you look at the pictures on your iPhone you are looking at encrypted images that your eyes somehow decrypt? If not, do you regard your device’s ability to display unencrypted pictures and text as a back door? Almost by definition, every encryption algorithm requires an unencrypted input.

              It is an utter red herring to present this as an invasion of privacy when you have as much—and arguably more—privacy if the scanning is happening on your secure device in your physical possession, rather than remotely on a massive server that you share with millions of other customers and that is maintained by thousands of company employees.

              Yet another example of fake news feeding mass hysteria.

          2. A “client-side scanning system” (the description quoted from the EFF) is absolutely a backdoor. Apple has entered into the user’s realm. The OS is no longer software that merely offers functionality to the user, or that includes parts and pieces Apple can use to monitor technical matters and make adjustments if and when needed.

            The addition to the OS has nothing to do with either. It serves to monitor.

            Your retort is not even funny, let alone applicable/instructive.

            1. The term “backdoor” has an agreed meaning. The Oxford dictionaries define it as “a feature or defect of a computer system that allows surreptitious unauthorized access to data.”

              A company that puts out a press release to announce a feature is not acting surreptitiously. A feature that requires a conscious decision to opt in is not allowing unauthorized access to data. A process that occurs entirely on the user’s encrypted device is not allowing access to data at all.

              The EFF and Edward Snowden do not get to redefine “backdoor” as “anything we do not like.”

    3. Whatever. You must be a pretty young person to show so much naivete. First, you don’t understand what back doors are. But second, this is appalling even as a FRONT door.

    4. You cannot reason with uneducated people; these are the same individuals who will not get the vaccine and will ultimately harm themselves and their family members. Thinning of the herd is not a bad thing sometimes. Sell your shares; I’ll add them to my mountain of shares, which are already up 950% and growing daily. See you, whiners.

  2. Headlines imply it will catch out people distributing child porn, but such people would not be using parental controls and would presumably be savvy enough not to use iCloud either.

    I don’t think this is quite the big deal that some imagine it is.

    1. I would also assume that such individuals storing images on iCloud would have yet another layer of encryption on the files/folders before upload which would make Apple’s scanning ineffective. All done years before Apple even considered this ‘new’ feature.

      1. The explanation isn’t very clear, but it looks like the scanning/screening is only applied to images contained in the Photos database, not to independent image files elsewhere in the iCloud file structure.

        1. That somehow seems insufficient for what is being required of Apple regarding CSAM. But reading the description in the article above, it does appear that the scanning is limited to the images uploaded to “iCloud Photos,” whether that is a single BLOB or an image folder.

          1. While I am sure they are serious about protecting children, Apple’s major concern is not what people choose to store on their own devices, but what they might store on Apple’s servers. Some bad stuff is always going to seep through. To avoid prosecution for possession of child pornography, the company has to demonstrate that they have taken reasonable steps to avoid or eliminate child sexual abuse materials. They are hoping that this step will be enough and that Federal and state prosecutors will not require something far more intrusive. That threat is not just hanging over Apple’s head, but all of its customers.

  3. What’s happening to the world? It seems like freedom is disappearing everywhere… when did big tech become an extension of the governments of the world?

    1. So you’ll go to other products that will do this and more, ten times worse than Apple? Out of the frying pan, into the fire? Smart move, doofski.

  4. Scanning for CSAM, as Apple describes it, is indeed a backdoor. What else are they concocting? To pick out CSAM, Apple scans the entire image file (what else do they pick up?), and CSAM makes a convenient excuse for Apple to cite as its motive to scan. Tim Cook always talks noble, but he so easily flips and flops on anything. He’s spineless. The only thing that disciplines him is his spreadsheet. I never trusted this character. Apple sold a lot of stuff on the promise of protecting consumers’ privacy. Really? I hope they are not milking us by creating a false image. My Apple is now a different company from what I thought it was. It’s stifling.

    1. Kent, they are scanning the entire photo file now. They have been doing that ever since they added the “People” feature to Photos (or iPhoto) years and years ago. Scanning for a match with a known child pornography image is actually easier than scanning for faces that match. That isn’t a deep, dark secret, even if Apple had not sent out a press release describing exactly what it is doing.

      The only new things are: First, if an image in an iMessage to or from a minor appears to contain pornography, the minor’s parent will be notified, but only IF the parent has opted into that program. Adults, and children whose parents have not opted in, can exchange all the nasty pictures they want and nobody will ever know. The data will remain private on their device and will not be shared with anyone. That is hardly a “back door.” The kids might argue that it invades their privacy, but I think their parents would disagree.

      Second, before an image is uploaded from Photos to iCloud (which only ever happens with your consent), it will be scanned on your device for matches with known child sexual abuse materials. That substitutes for the checking for pornography that Apple (and every other cloud service) is already doing on the server side. Currently, Apple is scanning all your images (that’s how they index them for image search). After the change, they will only be scanning for pornography on the images that have been flagged by your device as presumptively pornographic, and only if several of your uploaded images have been similarly flagged. If you don’t use iCloud, nothing will be flagged and no data will leave your device except as, when, and how you choose. Again, that is not a “back door.”

      Neither of these programs impair end-to-end encryption, as some of the hysterical criticism has been claiming, since they only scan the unencrypted data on an individual device. Any communications with other devices will be E2E encrypted as normal.

      Yes, the technology poses the threat of abuse, but that threat has not materially changed since Apple (and Google) instituted image search capabilities years ago. It does not create a “back door,” although one certainly could be created, just as it already could have been created. It does not invade privacy, since it only affects folks who have opted in to voluntary programs. That could change, but the risk of change is no greater now than it has been for years.
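
      To make the threshold idea above concrete, here is a minimal sketch; the type, the names, and the threshold value are illustrative assumptions, not Apple’s actual implementation:

      ```swift
      import Foundation

      // Individual matches accumulate silently; nothing is surfaced for
      // human review until the account crosses a match threshold.
      struct UploadScreener {
          let reviewThreshold: Int // illustrative value, set below
          var matchCount = 0

          /// Record one pre-upload scan result; returns true only once
          /// enough matches have accumulated to warrant review.
          mutating func recordScan(matchedKnownDatabase: Bool) -> Bool {
              if matchedKnownDatabase { matchCount += 1 }
              return matchCount >= reviewThreshold
          }
      }

      var screener = UploadScreener(reviewThreshold: 30)
      for _ in 0..<5 {
          // A handful of matches alone does not trip a 30-match threshold.
          print(screener.recordScan(matchedKnownDatabase: true)) // false
      }
      ```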

  5. Apple totally messed up. I don’t want any tech company to scan the content of my phone. Period. I am totally done with tech companies determining what I see, read, and think. I thought Apple was different, but they are not. As an Apple fanboy I will not hesitate to skip their products, as privacy is more important to me than fancy products.

  6. Apple is now officially on my lifetime shiat list. Linux, here I come. Putting the iPhone up for sale; already got an Android, and I need a custom ROM for it to screen out the crap. I have been a cheerleader and influenced many to buy Apple products; now I will do the opposite. Way to commit suicide, Apple.

  7. “Apple is enabling governments worldwide to effectively have access to user data, which could go beyond what Apple is currently planning, as is the case with all backdoors.”

    YES. The law of clueless Leftist do-gooders: UNINTENDED CONSEQUENCES.

    “While the system is being purported to detect child sex abuse, it could be adapted to scan for other text and imagery without user knowledge.”

    Cook fought tooth and nail against the Trump administration opening ONE PHONE to catch terrorists.

    But now, during the Biden administration, Cook makes a total course reversal, all in the typical Demonrat tactic: it’s all for the children. Yes, they need a holier-than-thou reason to spy on users, little different from the rest of Big Tech, notorious for reading and monitoring posts and then enforcing censorship when they do not follow Leftist orthodox religion.

    Yes, Cook threw iPhone users security and privacy under the bus.

    For breach of contract — FIRE COOK!…

  8. “Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” the letter reads.

    “We ask that Apple reconsider its technology rollout, lest it undo that important work.”
    They accused the project of lacking enough independent oversight to fend off eventual abuse and “unreasonable expansion” of surveillance.

    The signatories are demanding Apple immediately halt the deployment of the iPhone monitoring ability and come forward publicly with a statement promising to commit to user privacy.  

    The letter was signed by prestigious institutions like The New York Public Library, Freedom of the Press Foundation and the Privacy Foundation.

    Full story here: https://www.the-sun.com/news/3464672/apple-revolt-tech-scan-customers-iphones-child-abuse-privacy/

    Bottom line: Mr. Cook, I don’t want Apple continuously scanning my iPhone, a reversal of the security standards you preached for years, for ANY holy-name SJW cause…
