Apple warns retail and online employees to be ready for iPhone backdoor questions

Apple has warned retail and online sales staff to be ready to field questions from consumers about the activation of an iPhone backdoor ostensibly designed to scan U.S. customers’ phones and computers for Child Sexual Abuse Material (CSAM). The move has generated a growing backlash, including employees speaking out internally as well as intensified protests from leading technology policy groups.

Mark Gurman for Bloomberg News:

In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children.

MacDailyNews Take: We reviewed Apple’s hastily rolled-out, after-the-fact, damage-control FAQ and one question sprang immediately to mind.

Apple: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Q: What will Apple do if a government passes a law requiring it to look for other types of images (crime, terrorism, political, religious, sexual orientation, etc.)? Apple might not accede to a mere government request, but, given its long history of complying with all local laws, will Apple refuse demands that are backed by passed laws?

The tech giant also said it will address privacy concerns by having an independent auditor review the system…

Apple previously said it would refuse any requests from governments to utilize its technology as a means to spy on users.

MacDailyNews Take: Again, Apple might refuse government requests, but the company has a proven history of complying with local laws, which is why privacy advocates are raising the alarm; mission creep is a real concern here.

Once Apple’s conduit exists, governments will work to enact laws that utilize the opening Apple wants to provide here. With history as our guide, Apple will then accede.

The road to hell is paved with good intentions.

We oppose Apple’s ill-considered move to activate an iPhone backdoor intended to scan customers’ devices without their permission and have signed the Open Letter Against Apple’s Privacy-Invasive Content Scanning Technology, which asks that:

1. Apple Inc.’s deployment of its proposed content monitoring technology is halted immediately.
2. Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.

More info and link to the open letter against Apple’s privacy-invasive content scanning scheme here.

(By the way, whoever concocted and approved the tone-deaf rollout of this PR disaster for Apple… ay yi yi!)

13 Comments

  1. Somebody important at Apple must have been on summer vacation for this to get approved and launched like this – what a clusterfsck!

    Not only was the rollout awful, but the concept – as labored as it is – is ripe for mission creep.

    Actual pedophile kiddie porn criminals are not going to use Apple’s iCloud. They’re not using it now. And Apple and everyone with an IQ above 80 knows this. CSAM is merely the pretext for launching this surveillance-state system. “Think of the Children” is the Trojan horse.

    Now Apple has proclaimed publicly that photos in iCloud will be scanned (and compared with hashes of known illegal images), so it’s like gun control in that the only ones who will suffer are law-abiding persons whose rights will be infringed since, duh, criminals by definition do not follow the law.

    So, now, all Apple customers are, in effect, treated as criminals and must have their photos in iCloud checked in order to catch some infinitesimal percentage of bad actors.

    It’s a bad, bad look for Apple, regardless of their intentions, and the potential for future abuse by oppressive governments worldwide is huge.

    1. You’re right about the pedophile thing… most are not going to use a public storage service. These people are immoral, not necessarily stupid; they’ve already planned out local storage options they can encrypt. That’s how they get away with this. The more you think about it, the more nuts Apple’s plan is.

    2. Clever use of “fsck”… I remember making a joke about that a couple decades back. Nice to see someone else does, too.

      You’re totally right about 4A concerns. The government doesn’t have to lift a finger and Silicon Valley implements 1984 on steroids.

  2. In the current situation, images are checked server-side, and out of 100 images, say, 2 offending images may be detected.
    In the new situation, images are checked client-side, and out of the same 100 images, the same 2 will be flagged and reported when uploaded.

    Same result; the needle in the fight against child abuse hasn’t moved one bit (see the sketch below). Thus this clearly cannot be the motive.

    Aside from this, to my knowledge child abusers – and all other sorts of horrible criminals – do their stuff on the dark web, not on the web as we know it.
    When I read about law enforcement here recruiting people with significant IT skills, they want hacking skills, encryption skills, and knowledge of the dark web and its trade in narcotics, guns, and so on, because that is where they need to investigate. They do not advertise for people who know about something as irrelevant as iCloud Photos.

    The false pretext is so obvious, it is almost shameless.

    (copied from a reply I made in another thread)
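
    To make that concrete, here is a minimal Swift sketch of the matching step, using a plain SHA-256 hash as a stand-in for Apple’s perceptual NeuralHash (the hash list and function name below are hypothetical, for illustration only). The flagging test is the same set-membership check whether it runs on Apple’s servers or on your phone; only the location changes.

        import CryptoKit
        import Foundation

        // Hypothetical stand-in database of known-image hashes. (Apple's actual
        // system matches perceptual NeuralHashes against an NCMEC-supplied
        // database using private set intersection, not plain hex strings.)
        let knownHashes: Set<String> = ["b94d27b9934d3e08a52e52d7da7dabfa"] // placeholder entry

        // The membership test is identical server-side (current design) and
        // client-side (Apple's new design).
        func isFlagged(_ imageData: Data) -> Bool {
            let digest = SHA256.hash(data: imageData)                    // hash the image bytes
            let hex = digest.map { String(format: "%02x", $0) }.joined() // hex-encode the digest
            return knownHashes.contains(hex)                             // match against known hashes
        }

    Moving that test from server to phone changes where the scan happens, not what it catches.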

  3. Apple already scans all our photos. How do you think they produce the ‘For You’ and ‘Memories’ features? It’s just an extension of what they are already doing. Calm down, everyone.

    1. The fact that knives are already used for cutting meat and cleaning your nails does not mean they may be used to stab other people, for example. Every reason to be alarmed in that case.

      The concern is not about the technique as such, but about its intended use and the moving goalposts and mission creep that go along with that presumed intended use.

  4. Stop using iCloud (or any cloud service, for that matter), and
    stop upgrading your iPhone and Mac in protest. Just upgrade when you absolutely have to.

    This is the best consumer protest you can mount and it’s perfectly legal.

    In Australia, the government mandated that ISPs keep all individuals’ internet data for two years, which is then passed on to government agencies. It started with terrorism, then pedophile networks, then individual pedophiles, then fraud, and they only got stopped by the Australian Senate when they were talking about copyright infringement.

    Trust me, this is how governments operate. I don’t trust governments to do the right thing and I don’t trust companies that sidle up to them.

  5. Scanning pics in the cloud is one thing; I can opt in or out of that service. Scanning them on the phone is creepy and treats everyone like a pedophile. Pedophiles don’t use a cloud service. They are evil people and messed up in the head, but not necessarily stupid. I will cancel my iCloud service and not buy any more iPhones. I have been thinking of just buying used ones anyway as a lame attempt at protesting Apple’s use of slave labor in China. I figured the phone was already made, and buying a used one does not help Apple at all.
