Apple’s ill-considered iPhone backdoor has employees speaking out internally

The backlash over Apple’s ill-considered move to activate an iPhone backdoor, ostensibly to scan U.S. customers’ phones and computers for Child Sexual Abuse Material (CSAM), has grown to include employees speaking out internally and has drawn intensified protests from leading technology policy groups.

Joseph Menn and Julia Love for Reuters:

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

MacDailyNews Take: Of course the idea of activating an iPhone backdoor has damaged and is continuing to damage Apple’s reputation for protecting privacy.

When MacDailyNews staffers, after examining Apple’s scheme, go searching for “most secure smartphones” in order to replace their Apple iPhones, you’ve got issues.

Apple spent untold millions promoting “privacy” over a period of many years, only to squander it in one fell swoop.

Last week’s announcement is drawing heavier criticism from past outside supporters who say Apple is rejecting a history of well-marketed privacy fights.

People familiar with the matter said a coalition of policy groups is finalizing a letter of protest to send to Apple within days demanding a suspension of the plan. Two groups, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), both released newly detailed objections to Apple’s plan in the past 24 hours.

“What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in,” CDT project director Emma Llanso said in an interview. “It seems so out of step from everything that they had previously been saying and doing.”

Apple declined to comment for this story. It has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.

MacDailyNews Take: Bullshit.

When Apple removed VPN apps – or The New York Times app, or whatever app or service (Apple Books, for one example) the CCP demanded – from the App Store or Apple devices in China, the company told the developers of ExpressVPN, one of the apps affected, that its app was removed because “it includes content that is illegal in China.” In a subsequent statement, Apple said: “Earlier this year China’s MIIT announced that all developers offering VPNs must obtain a license from the government. We have been required to remove some VPN apps in China that do not meet the new regulations.”

Apple, which famously explains away the banning of apps in certain countries by saying that the company must comply with all local laws, might refuse requests that arrive without accompanying laws. But, based on a long history, it will not refuse demands backed by passed laws from governments to use the system to check phones for whatever oppressive governments come up with to monitor, categorize, and punish citizens who do not comply with official doctrine.

Apple, which, under Tim Cook, has attached itself to the China teat so firmly as to be hopelessly compromised, is lying by omission. Yes, they’ll refuse a “request,” but what will they do when it becomes law to scan for material on politics, religion, sexual orientation, and so on? Apple will follow the local laws, as it has in every known case of app banning or service removal.

Apple should never introduce this system into China, because Apple is not going to stand up to the CCP when the CCP could severely hamper or even shut down Apple’s business in a Beijing minute.

If the U.S., the first country where this scanning is planned, passes a law (for example, to scan for “terrorists,” using a broad definition, of course, which would allow for scanning for virtually anything in every photo on the device), history shows that Apple will follow the law.

Apple will refuse requests from governments until it becomes law. Then, as usual, Apple will follow the laws in each country where it operates.

…Apple has made other technical decisions that help authorities, including dropping a plan to encrypt widely used iCloud backups and agreeing to store Chinese user data in that country.

A fundamental problem with Apple’s new plan on scanning child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change, now that the capability is there, in exactly the same way it warned would happen if it broke into the terrorism suspect’s phone.

Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images that have been identified by the National Center for Missing & Exploited Children and a small number of other groups.

MacDailyNews Take: Whoever controls the database, or infiltrates the database, owns 1+ billion devices thanks to Apple’s iPhone backdoor.
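
To make concrete why control of that database matters, here is a minimal sketch in Python. This is not Apple’s implementation: the real system uses a perceptual “NeuralHash” and a blinded private-set-intersection protocol, so every name and value below is illustrative only. The point is that the device merely tests membership against opaque hashes, so whoever supplies the database decides what gets flagged.

```python
import hashlib
from pathlib import Path

# Opaque hash database supplied by NCMEC "and a small number of other
# groups." The device cannot tell what images these hashes represent;
# whoever controls this set controls what gets flagged.
flagged_hashes: set[str] = {
    "0123abcd",  # placeholder entry for illustration only
}

def image_hash(path: Path) -> str:
    """Stand-in for a perceptual image hash. (Apple's NeuralHash is
    designed to survive resizing and recompression; the SHA-256 used
    here for simplicity is not.)"""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_before_upload(photos: list[Path]) -> list[Path]:
    """Return photos whose hashes appear in the supplied database.
    In Apple's design this check runs on-device, before iCloud upload."""
    return [p for p in photos if image_hash(p) in flagged_hashes]
```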

Apple Inc., “A Message to Our Customers,” February 16, 2016:

We have great respect for the professionals at the FBI, and we believe their intentions are good.

[Ah, to be so naive. Ignorance is indeed bliss. But, we digress – MDN Ed.]

Apple continued:

Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Uh, yeah. Duh.

Apple continued:

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Gee, ya think?

Apple concluded:

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Ouch.

And who, exactly, are these “small number of other groups”? Since Apple is stupidly admitting them into users’ devices to scan their photos without users’ permission, the company should not only list this mystery “small number of other groups,” it should also explain how each of their databases is perfectly secure and hack-proof, including against state actors.

Good luck with that.

But any country’s legislature or courts could demand that any one of those elements be expanded, and some of those nations, such as China, represent enormous and hard-to-refuse markets, critics said.

Police and other agencies will cite recent laws requiring “technical assistance” in investigating crimes, including in the United Kingdom and Australia, to press Apple to expand this new capability, the EFF said.

“The infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible,” wrote EFF General Counsel Kurt Opsahl.

MacDailyNews Take: We oppose Apple’s ill-considered move to activate an iPhone backdoor intended to scan customers’ devices without their permission and have signed the Open Letter Against Apple’s Privacy-Invasive Content Scanning Technology which asks that:

1. Apple Inc.’s deployment of its proposed content monitoring technology is halted immediately.
2. Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.

More info and link to the open letter against Apple’s privacy-invasive content scanning scheme here.

26 Comments

  1. As per my comment from Monday, August 9:

    Who controls the NCMEC database? How secure is the NCMEC database?

    Whoever controls the NCMEC database controls 1+ billion iPhone users.

    Did Tim Cook ever take even a basic history course when he was studying to become a Compaq parts orderer who’d one day cajole a sick, dying Steve Jobs into making yet another bad CEO pick?

      1. Once they broke Steve Jobs’ privacy promise of “inform the user, ask them for permission, repeatedly” by installing a scanning back door, I gave up on them.

        Unless Apple turns about and says, “We made a mistake; we won’t be installing a scanning back door, and never will,” Apple is dead to me.

        Steve Jobs was good at this. When he made a mistake with DVD drives, then did an about-face and put in CD burners for what ended up being the successful “Rip. Mix. Burn.” campaign, he showed flexibility. Moron Cook is incapable of doing anything other than being an idiot plastic cue-card reader.

        F Apple. I just don’t care anymore.

    1. This is a gross violation of Apple’s privacy values and of human rights, but, to be fair, the database-security argument (that someone could insert something into the database) is weak. The images are flagged for review by an Apple person, so it’s not as if the database automatically triggers the involvement of law enforcement or a release of an image to any party other than Apple.

      1. As if “an Apple person” or the dweeb who controls the NCMEC database and the “small number of other groups’” databases cannot be bought off or have evidence planted against them that forces them to comply with “national security” demands.

        Apple stupidly wants to create a tunnel into iPhone. With iOS 15, the iPhone will only be as secure as the security – or insecurity – of that tunnel.

        Other people now control the tunnel into your iPhone. People can and will be bought, blackmailed, complicit, etc.

        Apple gave $1 million to the Southern Poverty Law Center – a scam leftist front – and banned Parler for the same reasons it should have banned Twitter, but didn’t, of course. 91% of contributions by Apple employees have gone to Democrats vs. 9% to Republicans since 2004.

        Tell me that the average leftist CNN/MSNBC-brainwashed socialist Apple Silicon Valley Kalifornia employee wouldn’t want to help the saintly Biden administration scan for images of people walking inside velvet ropes and posing for selfies inside the Capitol Building on January 6th during the oh-so-catastrophic, makes-the-holocaust-pale-in-comparison “insurrection?”

  2. Perhaps do not use an iPhone to take any pictures anymore when this begins. Buy a digital SLR; it will take better pictures, anyway. Privacy and picture quality outweigh the convenience of the little thing in your pocket.

    1. So, Apple has taken one of the iPhone’s top selling points – the camera system, in which Apple has invested hundreds of millions in R&D over many years – off the table for potential customers.

      Tim Cook is so smart.

  3. FIRE those who advised Tim Cook this was a good idea.

    They can’t claim they won’t cave to government requests when the very organization they’re advocating for is partially funded by the FBI.

    The closing line in the movie “WarGames” said it best: “A strange game. The only winning move is not to play. How about a nice game of chess?”

  4. Check…

    CSAM Detection Technical Summary, August 2021 (CSAM_Detection_Technical_Summary.pdf)

    Supposedly, Apple can only see the images after they have been flagged by the system. If they suddenly come up with images that don’t appear to be children, they can opt not to report them to the authorities. Also, there has to be a threshold number of suspicious entries in an iCloud library before the account is reported.

    I “agree” with Apple’s assessment of the security of their system. I do not believe that it qualifies as a “backdoor.” I STILL DISAGREE WITH IMPLEMENTING IT, and suspect the use of the term “backdoor” will help to convince them not to.

    It also seems like it is easy to avoid the system.

    First of all, if you are someone trafficking in such images, it would seem simple to just not use the photo library on your phone. Especially, do not use iCloud Photos. According to the CSAM Technical Summary PDF published by Apple:

    “…Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations… Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes…”

    You could have a ton of horrible images stored on your phone, but Apple won’t look at them until you attempt to transfer them to your iCloud database.

    The document repeatedly states that images are only scanned before being copied to the iCloud database. So it seems the first thing to do is go to the Apple ID pane in System Preferences and turn off Photos, right? (A rough sketch of this gating logic appears at the end of this comment.)

    You can store your images on any number of other online services.

    Though I wonder how many services like Dropbox are already doing this.

    Even if you can avoid the system, it shouldn’t be implemented. I’ve written a courteous letter to Tim Cook saying why I object to the system being implemented, not the least of which is that my beloved device feels less loved and more, I dunno… dirty.
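
    A rough sketch, in Python, of the reporting gate this comment describes. In Apple’s published design the threshold is enforced cryptographically with threshold secret sharing, so Apple cannot even count matches below it; the simple counter and the MATCH_THRESHOLD constant below are hypothetical stand-ins, not Apple’s actual mechanism or number.

    ```python
    # Hypothetical sketch of the threshold gate described in Apple's
    # CSAM Detection Technical Summary. The real design uses threshold
    # secret sharing; this counter is a simplification.
    MATCH_THRESHOLD = 30  # hypothetical value, not Apple's published figure

    def account_is_reported(match_count: int, icloud_photos_enabled: bool) -> bool:
        """An account is surfaced for human review only if iCloud Photos
        uploads are enabled AND enough hash matches accumulate -- which is
        why turning off iCloud Photos, as suggested above, sidesteps the scan."""
        return icloud_photos_enabled and match_count >= MATCH_THRESHOLD

    # Example: many matches, but iCloud Photos disabled -> never reported.
    assert account_is_reported(100, icloud_photos_enabled=False) is False
    ```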

    1. Totally understand what you’re saying.

      The issues for me are:

      This technology will be abused by governments. As MDN says, Apple will refuse a “request” but obey a law to apply this technology to other use cases. Apple can’t be compelled to invent a whole new technology (compelled speech), but it can be forced to apply a technology it has created in a slightly new way. Apple opened this Pandora’s box.
      These technologies are so very personal. Knowing your phone is silently scanning and evaluating the content you generate will change the way you think about the technology. To a generation that uses digital technology the way we all used libraries and diaries and sketchbooks, it’s an erosion of the privacy of your own mind and creations. It’s an erosion of self.

      1. Your iPhone has been silently scanning and evaluating all of your photos for years, identifying people, places and your pets, and serving you up “memories”.

        Funny that folks didn’t complain about that…

        1. The past never nullifies current concerns. Also, growing awareness resulting in improved understanding is a good thing.

          Those techniques also did not report anything outside of your iPhone. It is not possible, for example, for Apple to cross-reference your images in iCloud with the contacts on your device.

          So it is perfectly normal that there was not this amount and intensity of concern until now. I don’t see anything contradictory there, only the healthy evolution of this kind of debate.

  5. As I’ve often stated with a dose of sadness, the fascinating and intriguing Story that once characterized Apple is complete. Tim is writing a great financial story, but the result is big, pedestrian and veering towards the oligopolistic.

  6. It is good to hear that people inside Apple, too, are not falling for the false pretext of “think of the children,” as it is very easy to see that it is a bogus argument.

    In the current situation, images are checked server-side, and out of 100 images, for example, 2 illicit images may be detected.
    In the new situation, images are checked client-side, and out of the same 100 images, the same 2 will be flagged and reported when uploaded.

    Same result; the needle in the fight against child abuse hasn’t moved one bit, so this clearly cannot be the motive. (See the sketch at the end of this comment.)

    Aside from this, to my knowledge child abusers – and all other sorts of horrible criminals – do their stuff on the dark web, not on the web as we know it.
    When I read about law enforcement recruiting people with significant IT skills, they want hacking skills, encryption skills, and knowledge of the dark web and its dark trade in narcotics, guns, and so on, because that is where they need to investigate. They do not advertise for people who know about something as irrelevant as iCloud Photos.

    The false pretext is so obvious, it is almost shameless.
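
    To put the “same result” point in code: a toy comparison, with invented names, of server-side matching after upload versus client-side matching before upload. Given the same database and the same images, both produce the same flags.

    ```python
    # Toy illustration (all names invented): the same hash database
    # applied to the same images yields the same flags whether matching
    # runs server-side after upload or on-device before upload.
    def server_side_flags(uploaded: set[str], database: set[str]) -> set[str]:
        return uploaded & database

    def client_side_flags(local: set[str], database: set[str]) -> set[str]:
        # On-device: check each image destined for iCloud against the database.
        return {h for h in local if h in database}

    db = {"hash_a", "hash_b"}
    photos = {"hash_a", "hash_c"}
    assert server_side_flags(photos, db) == client_side_flags(photos, db)
    ```

    What changes is only where the matching code lives, which is exactly what critics say makes it easier to point the system at other content later.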

  7. Apple now is the iCommunist Phone. It will report all your data to China’s CCP leadership. Tim Cook reports to some underling of the Chinese Communist Party. China, and its US Democrat Party subordinates, won’t need a backdoor. Tim will give them keys to the front door, the sliders, the windows, the safety deposit box, the children’s piggy banks, and your bedroom. Tim has killed the goose that laid the golden eggs. Tim is officially now the Anti-Steve Jobs.

  8. This Will Destroy Apple, Tim!!!!!!
    You are in essence plugging surveillance spyware into the heart of Apple’s platform (which can and will be exploited!)

    It is perplexing that Apple even considered it, let alone brought it this far…

    Apple will become exactly the OPPOSITE of what it has been preaching… the mantras of SECURE AND PRIVATE that everyone believed in and was willing to pay a premium for will be flushed down the toilet!

    You are breaking not only the Security/Privacy paradigm but also Apple’s ecosystem at the same time, Tim!

    What is left to distinguish Apple???? Quality???… Apple is no longer alone in that department!

    There will be no way back once that trust and faith is broken!

    And to start with, you are not solving anything, nothing!! Those engaging in these activities, the activities you are willing to break Apple over, can easily circumvent it!

    DON’T DO IT!!!

    I can’t believe what’s going on… very hard to swallow!!!!

    (In a certain corner of my gut I feel much bigger forces are involved here than just Apple’s/Tim’s so-called virtuous ambition.)
