Nearly 100 civil rights groups worldwide call on Apple to abandon CSAM iPhone surveillance

Nearly 100 civil rights groups worldwide have signed an open letter objecting to Apple’s plan to institute iPhone surveillance capabilities, asking Apple to abandon its ill-conceived scheme for CSAM scanning. The letter spotlights the risk of misuse by governments around the world.

iPhone backdoor

On August 5, 2021, Apple announced that it will be installing surveillance software that will conduct on-device scanning of Messages and photos. As the coalition’s letter explains, although the new features are designed to protect children and reduce the spread of child sexual abuse material (CSAM), they will create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world.

In particular:

• The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.
• Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.

The breadth of the international coalition joining the letter demonstrates the extent to which Apple’s plans open the door to threats to human rights across the globe.

Center for Democracy & Technology:

19 August 2021

Tim Cook
CEO, Apple, Inc.

Dear Mr. Cook:

The undersigned organisations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products. Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.

Apple announced that it is deploying a machine learning algorithm to scan images in its text messaging service, Messages, to detect sexually explicit material sent to or from people identified as children on family accounts. This surveillance capability will be built right into Apple devices. When the algorithm detects a sexually explicit image, it warns the user that the image may be sensitive. It also sends a notice to the organiser of a family account whenever a user under age 13 chooses to send or to receive the image.

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.

Apple also announced that it would build into the operating system of its products a hash database of CSAM images provided by the National Center for Missing and Exploited Children in the United States and other child safety organisations. It will scan against that database every photo its users upload to iCloud. When a preset threshold number of matches is met, it will disable the account and report the user and those images to authorities. Many users routinely upload the photos they take to iCloud. For these users, image surveillance is not something they can opt out of; it will be built into their iPhone or other Apple device, and into their iCloud account.

Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable. Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.

We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk, both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.

Access Now (Global)
Advocacy for Principled Action in Government (United States)
African Academic Network on Internet Policy (Africa)
AJIF (Nigeria)
American Civil Liberties Union (United States)
Aqualtune Lab (Brasil)
Asociación por los Derechos Civiles (ADC) (Argentina)
Association for Progressive Communications (APC) (Global)
Barracón Digital (Honduras)
Beyond Saving Lives Foundation (Africa)
Big Brother Watch (United Kingdom)
Body & Data (Nepal)
Canadian Civil Liberties Association
CAPÍTULO GUATEMALA DE INTERNET SOCIETY (Guatemala)
Center for Democracy & Technology (United States)
Centre for Free Expression (Canada)
CILIP/ Bürgerrechte & Polizei (Germany)
Código Sur (Centroamerica)
Community NetHUBs Africa
Dangerous Speech Project (United States)
Defending Rights & Dissent (United States)
Demand Progress Education Fund (United States)
Derechos Digitales (Latin America)
Digital Rights Foundation (Pakistan)
Digital Rights Watch (Australia)
DNS Africa Online (Africa)
Electronic Frontier Foundation (United States)
EngageMedia (Asia-Pacific)
Eticas Foundation (Spain)
European Center for Not-for-Profit Law (ECNL) (Europe)
Fight for the Future (United States)
Free Speech Coalition Inc. (FSC) (United States)
Fundación Karisma (Colombia)
Global Forum for Media Development (GFMD) (Belgium)
Global Partners Digital (United Kingdom)
Global Voices (Netherlands)
Hiperderecho (Peru)
Instituto Beta: Internet & Democracia – IBIDEM (Brazil)
Instituto de Referência em Internet e Sociedade – IRIS (Brazil)
Instituto Liberdade Digital – ILD (Brazil)
Instituto Nupef (Brazil)
Internet Governance Project, Georgia Institute of Technology (Global)
Internet Society Panama Chapter
Interpeer Project (Germany)
IP.rec – Law and Technology Research Institute of Recife (Brazil)
IPANDETEC Central America
ISOC Bolivia
ISOC Brazil – Brazilian Chapter of the Internet Society
ISOC Chapter Dominican Republic
ISOC Ghana
ISOC India Hyderabad Chapter
ISOC Paraguay Chapter
ISOC Senegal Chapter
JCA-NET (Japan)
Kijiji Yeetu (Kenya)
LGBT Technology Partnership & Institute (United States)
Liberty (United Kingdom) (EU/DE)
May First Movement Technology (United States)
National Coalition Against Censorship (United States)
National Working Positive Coalition (United States)
New America’s Open Technology Institute (United States)
OhmTel Ltda (Colombia)
OpenMedia (Canada/United States)
Paradigm Initiative (PIN) (Africa)
PDX Privacy (United States)
PEN America (Global)
Privacy International (Global)
Progressive Technology Project (United States)
Prostasia Foundation (United States)
R3D: Red en Defensa de los Derechos Digitales (Mexico)
Ranking Digital Rights (United States)
S.T.O.P. – Surveillance Technology Oversight Project (United States)
Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
Sero Project (United States)
Simply Secure (United States)
Software Freedom Law Center, India
SWOP Behind Bars (United States)
Tech for Good Asia (Hong Kong)
TEDIC (Paraguay)
Telangana (India)
The DKT Liberty Project (United States)
The Sex Workers Project of the Urban Justice Center (United States)
The Tor Project (Global)
US Human Rights Network (United States)
WITNESS (Global)
Woodhull Freedom Foundation (United States)
X-Lab (United States)
Zaina Foundation (Tanzania)

MacDailyNews Take: If you’re concerned about Apple’s recent announcement that the next version of iOS will install a mass surveillance backdoor into Apple devices, the Electronic Frontier Foundation (EFF) has created a petition to let users speak out.

The EFF petition, which we have signed and recommend that our readers sign as well, reads as follows:

Don’t Scan Our Phones

The “child safety” changes Apple plans to install in iOS 15 and macOS Monterey undermine user privacy, and break the promise of end-to-end encryption.

I urge Apple to reconsider these systems. Continuous scanning of images won’t make kids safer, and may well put more of them in danger. Installing the photo-scanning software on our phones will spur governments around the world to ask for more surveillance and censorship abilities than they already have.




  1. It’s too late now for Apple to ever claim to be the privacy company again. They’ll have to try to plow on through and institute this surveillance debacle despite widespread, worldwide objections.

    Tim Cook now opposes nearly 100 civil rights groups worldwide. It’s delicious! The sanctimonious hypocrite deserves his fate.

    The only other option for Apple would be to hastily kill the whole deal and retreat without a plan, leaving Craig Federighi and his troops behind to be slaughtered.

    1. “…retreat without a plan, leaving Craig Federighi and his troops behind to be slaughtered.”

      Kinda like leaving thousands of US citizens behind enemy lines to be slaughtered?

      1. MSM still defends current administration by not reporting the extent of death and chaos.

        The Taliban has a hit list of known Christians they are targeting to pursue and kill. The US Embassy is defunct and there is no longer a safe place for believers to take refuge.

        The Taliban are going door-to-door taking women and children. The people must mark their house with an “X” if they have a girl over 12 years old, so that the Taliban can take them. If they find a young girl and the house was not marked they will execute the entire family. If a married woman 25 years or older has been found, the Taliban promptly kill her husband, do whatever they want to her, and then sell her as a sex slave.

        Wonder if Apple will help with humanitarian aid or use their vocal platform to shine a light on human injustice. Even though the majority of victims are women, children, minorities of religion and gender preference, I bet Apple remain silent like most of the far progressive right. All women must be heard except for these women …. and the women in China

        1. Great Orange President had a plan for getting out:

          US citizens first
          Others second
          US military equipment out
          Taliban “controlled” under conditional pullout under threat of military enforcement
          US troops out

          Biden only did the last. What happened to 1 to 4?

          Why are thousands of US citizens now trapped behind enemy lines?
          Why are UK soldiers going out into Kabul to escort their citizens to the airport, but the US is not?
          Why did Biden GIFT billions of dollars of US military equipment to the Taliban? China and Russia will get their hands on it, too

          Time for the Ice Cream Occupier to go. Oh… but then we are stuck with Harris. Hmm.. what a mess!

        2. Yep! One star for our fake commander in chief.

          SIX STARS for US citizens who are being left behind as you read this post.

          Why is the UK assisting its citizens while Biden is not? Anyone find that just a little weird? Even just a little?

          Meanwhile at the southern border…

        3. Apple cannot do anything to shine any light on this fake administration’s fakery. They can only stand by and smile and nod along with Joe. They voted for and helped him occupy the WH. What will they do, admit they were and are wrong? Nope. Just nods and claps for Joe

    2. Yes! I didn’t know this had taken such a giant, enormous turn in privacy violations! The way they excuse themselves with their reasons for doing it is ridiculous! All of us have bought smartphones from companies that don’t respect user privacy at all, and now they want to play policeman on them? With what crooked legal support? If so, then for all of us iPhone users (and I assume Google will do the same, or is doing it right now on Android), the only punishment that hurts, the only payback from users, is to abandon the BRAND! We are all under mass surveillance without their having any legal support for it, and that is criminal! This is exactly why I don’t use the internet on my smartphone (it’s Android-powered): the enormous number of privacy threats is too much for me! So every user will have to choose whether to use an iPhone or not! The only problem is that almost every aspect of interconnected life in the modern West will make it almost impossible to turn the spying off!

  2. These human rights groups are actually calling on a company owned and run by the Chinese Communist Party to respect human rights and privacy??? Are these people that ignorant of the history of communist enterprises?

      1. But they may well do so when the NSA and others knock on their door, no matter who the owner is. Don’t be so sure: in the West, everyone in power struggles to control all individuals in secret (Snowden), despite having a “democracy” that has become “virtual” and “glassy,” and all of our asses are on the line, because the internet and free speech have become a huge danger and a disturbing element to the elite in power, so they try to control it.

  3. Apple’s timing is interesting. The company waited until the eve of the iOS 15 release to announce the CSAM scanning instead of announcing it at WWDC, assuming the functionality will appear with the new release, if not before or sometime after. This announcement seems rushed and suggests that the company would rather hide it amongst the welter of software and hardware releases it has been making lately. I am hoping that we might learn why Apple would damage its reputation in this way, and it makes me wonder what other backdoors the company has built, or intends to build, elsewhere in its software or its new line of computer chips. In any case, I hope knowledgeable people will come forward with an explanation of what is, on the face of it, Apple’s inexplicable behavior.

  4. I would not be surprised to learn that both China and/or USA (and maybe other countries) have collaborated with Apple to do this as a back door for their spying efforts.

    Once a door is opened for any one reason or another – legally, the law will be on the side of “open” for any reason in governmental requests.

  5. There is a way for all of us to force the issue: start canceling all your Apple services, iCloud for example, or for that matter cancel Apple TV too. Hit the company where it hurts and they’ll reverse their position.

  6. Isn’t it the case that the scanning is done on the device, and that neither Apple nor anyone else has access to what was scanned? All photos are scanned anyhow for the AI recognition features. This simply means that certain photos cannot be uploaded, and no one knows there was even an attempt to upload the files to iCloud, since it is all performed on device?

    1. I think the first two ‘passes’ to determine whether a photo matches known CSAM are automated, so no human ever sees it, but if the photo is flagged by both, I understand a human reviews the image to make sure it is not a false positive. As for on-device versus off-device, I believe the on-device scanning is for iMessage-related scans, while the off-device scanning covers the user’s photo library in iCloud.
