EFF petition lets users speak out against Apple’s mass surveillance plans

If you’re concerned about Apple’s recent announcement that the next version of iOS will install a mass surveillance backdoor into Apple devices, the Electronic Frontier Foundation (EFF) has created a petition to let users speak out.

iPhone backdoor

Electronic Frontier Foundation:

Mass surveillance is not an acceptable crime-fighting strategy, no matter how well-intentioned the spying. If you’re upset about Apple’s recent announcement that the next version of iOS will install surveillance software in every iPhone, we need you to speak out about it.

SIGN THE PETITION
Tell Apple: Don’t Scan Our Phones

Last year, EFF supporters spoke out and stopped the EARN IT bill, a government scheme that could have enabled the scanning of every message online. We need to harness that same energy to let Apple know that its plan to enable the scanning of photos on every iPhone is unacceptable.

Apple plans to install two scanning systems on all of its phones. One system will scan photos uploaded to iCloud and compare them to a database of child abuse images maintained by various entities, including the National Center for Missing and Exploited Children (NCMEC), a quasi-governmental agency created by Congress to help law enforcement investigate crimes against children. The other system, which operates when parents opt into it, will examine iMessages sent by minors, using an algorithm that looks for any type of “sexually explicit” material. If an explicit image is detected, the phone will notify the user and, depending on the child’s age, possibly the user’s parent.

These combined systems are a danger to our privacy and security. The iPhone scanning harms privacy for all iCloud photo users, continuously scanning user photos to compare them to a secret government-created database of child abuse images. The parental notification scanner uses on-device machine learning to scan messages, then informs a third party, which breaks the promise of end-to-end encryption.

Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users.

SIGN THE PETITION
Tell Apple: Don’t Scan Our Phones

MacDailyNews Take: The EFF petition, which we have signed and recommend that our readers sign as well, reads as follows:

Don’t Scan Our Phones

The “child safety” changes Apple plans to install on iOS 15 and macOS Monterey undermine user privacy, and break the promise of end-to-end encryption.

I urge Apple to reconsider these systems. Continuous scanning of images won’t make kids safer, and may well put more of them in danger. Installing the photo-scanning software on our phones will spur governments around the world to ask for more surveillance and censorship abilities than they already have.

Sincerely,

SIGNED
Your Name

SIGN THE PETITION
Tell Apple: Don’t Scan Our Phones

6 Comments

  1. I wonder what the FBI, etc. have on Tim Cook, or have manufactured against Tim Cook, to make him do such a 180-degree about-face on privacy, throwing away Apple’s privacy reputation and all the money invested in marketing privacy over so many years.

    For Apple to stoop to using the “Think of the Children” crutch for introducing mass surveillance into their products, whatever the FBI has against Cook must be a doozy!

    Either that, or Tim Cook is just a spineless hypocrite who folded like a cheap suit.

  2. I wonder if my hesitancy (fear) to attach my name to such a document, while being skeptical of Apple’s move, points to a sense/awareness and fear of the growing recon state?

    I’ll say, “si.”

  3. Hint: Find and listen to very smart software guru Craig Federighi’s brilliant explanation of how Apple’s clever new AI systems work to help prevent sick-fuck psycho child porn ‘pervs’ from being able to store their ‘pervy turn-on’ photos in the encrypted safe haven known as iCloud.

    As a software engineer I’ve implemented temporary “back doors” in software during the development and testing phases to enable faster observation and evaluation turnaround. By my definition, Apple’s ‘comparison’ system IS NOT AND NEVER WILL BE A BACK DOOR and NO OUTSIDER IS LOOKING AT PHOTOS located on one’s iPhone.

    It’s a 2-part software system [local + global] that automatically pre-compares HASH VALUES computed for each photo that the iPhone owner wants to store in iCloud, before they are sent to iCloud. That’s part 1 and it occurs on the iPhone. If one’s photos are not going to iCloud then this does not occur. REMEMBER: there is no requirement that one store one’s photos in iCloud … it’s strictly optional and could be extra-cost. [BTW, for those not aware, a hash in this context is a compact numerical fingerprint computed from a photo by a hash function; the photos themselves are never what gets compared.]

    There are KNOWN ‘pervy’ photo HASHES that are internationally recognized. If one optionally wants to store a photo in iCloud, then — and only then — that photo is converted into a hash and compared AS A HASH to the known ‘pervy’ hashes. If there are no hash matches, the photo is encrypted and sent immediately to iCloud for storage. However, photos whose hashes match one or more of the known ‘pervy’ hashes are marked as POTENTIALLY illegal before they are sent to iCloud IN AN UNENCRYPTED FORM along with their hashes. They enter hash purgatory.

    At Apple’s iCloud servers [part 2], only photos whose hashes were marked as potentially ‘pervy’ are again run through that part of the comparison system as a double check. If an account accumulates confirmed matches beyond Apple’s stated threshold (roughly 30 photos), the actual unencrypted potentially ‘pervy’ photos are then sent to live humans for examination and disposition. (A rough sketch of this two-part flow appears after the comments.)

    Those live humans at Apple (or perhaps at a contractor) could be the weak link in this system if they are religious fanatics or have axes to grind. It would be best to have a ‘jury’ of 5 to 7 folks examine all flagged pics to ensure that no one individual is ‘calling the cops’ and putting innocent and/or naive people through hell unnecessarily.

    Since Apple owns iCloud it’s their RIGHT to ensure that NO ‘pervy’ photos are stored on their servers in an encrypted form. Unfortunately, there are probably thousands of ‘pervy’ photos already encrypted in iCloud from years past. These cannot be located by Apple — because Apple doesn’t have the decryption keys — and so the sick-fucks who are storing them there can rest easy at this point, but they’ll be prevented from adding to their smarmy collections in the future.

    Since Apple controls the software OSs that go onto iPhones, and Apple owns iCloud and its management software, and all comparisons are done using hashes, there’s little chance of outsiders modifying the systems. And you can bet that Apple will be testing all sorts of near-pervy pics to ensure that their AI-based systems do not overreact to them and become the basis for embarrassing lawsuits.

    Back in the 35mm camera days when we had to send our film rolls to Kodak in Rochester NY for developing and conversion into slides, I was astounded one day to receive a warning because one of the photos was considered lewd by someone at Kodak and so they would not develop it into a slide (but they returned the negative). I had walked into the front hall with my camera as my 10-year-old son was sliding down the hall bannister, and when he hit the carved wooden pillar at the bottom I snapped a pic. Apparently the person scanning the slides as they were being produced decided that his holding onto an apparently ‘giant erect phallic symbol’ between his legs (the bannister) was visually unacceptable. To this day, 40 years later, that photo is still somewhere in negative form. So, before digital smartphone photography, photo censorship was extensively practiced by commercial developers, as was their right.

    Apple’s other newly announced AI-based scanning technology, which looks at images arriving in text messages, turns out to be a very clever aid for parents to prevent their young kids from getting ‘dick & pussy pics’ from ‘pervy’ adults (or naive cohorts). PARENTS would have the software activated on their children’s iPhones and, should a d&p pic arrive in a text message on their children’s iPhones, it would be blurred out and unrecognizable. A decision would then have to be made by the young recipient to either open the pic (via slide switch) — and then know that a copy of the unblurred ‘perv’ text with pic would be automatically sent to their parents — OR to delete the text altogether; the better alternative. Consenting adults who enjoy sending d&p pics back and forth to one another would, of course, NOT activate the text scanning option, and therefore continue to marvel at each other’s fitness. Adults who don’t want to receive such material from anyone should also have the scanning option activated. IMPORTANTLY, the scanning function occurs ON THE IPHONE, not by Apple during transmission. (A rough sketch of this decision flow also appears after the comments.)

    If he were alive today, I’m convinced that Steve Jobs would stand strongly for these kinds of new ‘weeding’ software apps and be proud that his company was doing something about the growing threats to children.

    1. The problem isn’t what it’s doing now, it’s what governments will demand it do in the future, genius.

      Privacy means people know what they’re signing up for, in plain English and repeatedly. I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data. — Steve Jobs

      Nearly 100 civil rights groups worldwide call on Apple to abandon CSAM iPhone surveillance
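
Comment #3 above describes a two-part, hash-based comparison flow. Below is a minimal Swift sketch of that flow; the type names, toy hash values, single fixed hash set, and threshold handling are all illustrative assumptions, not Apple’s actual NeuralHash, safety-voucher, or private-set-intersection machinery.

```swift
import Foundation

// Toy stand-in for a perceptual hash of a photo. Apple’s real system computes a
// NeuralHash on-device; this plain Int is purely illustrative.
typealias PhotoHash = Int

struct Photo {
    let name: String
    let hash: PhotoHash   // assumed to be computed on-device before any upload
}

// Hypothetical set of known-CSAM hashes (stand-in for the NCMEC-derived database).
let knownBadHashes: Set<PhotoHash> = [0xBAD1, 0xBAD2, 0xBAD3]

// Part 1 (on-device): photos headed for iCloud are pre-checked by hash only.
// Non-matching photos upload normally; matching ones are flagged for a re-check.
func onDevicePreCheck(_ photosForUpload: [Photo]) -> (clear: [Photo], flagged: [Photo]) {
    var clear: [Photo] = []
    var flagged: [Photo] = []
    for photo in photosForUpload {
        if knownBadHashes.contains(photo.hash) {
            flagged.append(photo)
        } else {
            clear.append(photo)
        }
    }
    return (clear, flagged)
}

// Part 2 (server side): flagged photos are re-checked, and only an account whose
// confirmed matches exceed a threshold (Apple has cited roughly 30) is escalated
// to human review.
let reviewThreshold = 30

func serverSideDoubleCheck(flagged: [Photo]) -> [Photo] {
    let confirmed = flagged.filter { knownBadHashes.contains($0.hash) }
    return confirmed.count >= reviewThreshold ? confirmed : []
}

// Example: one library photo happens to collide with a known hash.
let library = [Photo(name: "IMG_0001", hash: 0xCAFE),
               Photo(name: "IMG_0002", hash: 0xBAD2)]
let result = onDevicePreCheck(library)
print("uploaded normally:", result.clear.map(\.name))
print("flagged for server re-check:", result.flagged.map(\.name))
print("escalated to human review:", serverSideDoubleCheck(flagged: result.flagged).map(\.name))
```

The sketch keeps the two checks deliberately separate: the on-device pre-check decides only what gets flagged, and the server-side re-check plus the match threshold decide what, if anything, is escalated to human review.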
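
Comment #3 also walks through the opt-in Messages feature for children. Here is a minimal Swift sketch of that decision flow; the settings structure, the placeholder classifier, and every name are hypothetical, and Apple’s actual feature runs its machine-learning check on-device inside Messages rather than anything like this.

```swift
import Foundation

// Placeholder for the on-device classifier; a trivial size check stands in for
// machine learning purely so the example runs.
func looksSexuallyExplicit(_ imageData: Data) -> Bool {
    return imageData.count > 1_000_000
}

enum IncomingImageOutcome {
    case deliveredNormally           // feature off, or image not flagged
    case blurredAwaitingChoice       // flagged: child must choose to view or delete
    case viewedWithoutNotification   // child viewed it; no parent notification configured
    case viewedAndParentNotified     // child viewed it; parent is notified (younger kids)
}

struct MessagesSafetySettings {
    let parentOptedIn: Bool          // strictly opt-in, per the announcement
    let notifyParentOnView: Bool     // applies only to younger children in the family
}

// Decide what happens when an image arrives on a child's phone.
func handleIncomingImage(_ imageData: Data,
                         settings: MessagesSafetySettings,
                         childChoosesToView: Bool) -> IncomingImageOutcome {
    guard settings.parentOptedIn, looksSexuallyExplicit(imageData) else {
        return .deliveredNormally
    }
    guard childChoosesToView else {
        return .blurredAwaitingChoice    // image stays blurred; nothing leaves the phone
    }
    return settings.notifyParentOnView ? .viewedAndParentNotified : .viewedWithoutNotification
}

// Example: feature enabled for a young child who taps through the blur warning.
let settings = MessagesSafetySettings(parentOptedIn: true, notifyParentOnView: true)
let outcome = handleIncomingImage(Data(count: 2_000_000),
                                  settings: settings,
                                  childChoosesToView: true)
print(outcome)   // viewedAndParentNotified
```

The key property the sketch tries to capture is that everything is evaluated locally: the classifier, the blur decision, and the notification choice all run in the phone’s own logic, and the unblurred image is only surfaced after an explicit action by the recipient.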
