Nearly 100 civil rights groups worldwide have signed an open letter urging Apple to abandon its ill-conceived plan to build CSAM-scanning surveillance capabilities into the iPhone. The letter spotlights the risk of misuse by governments around the world.
On August 5, 2021, Apple announced that it will install surveillance software that conducts on-device scanning of images in Messages and of photos uploaded to iCloud. As the coalition’s letter explains, although the new features are designed to protect children and reduce the spread of child sexual abuse material (CSAM), they will create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world.
• The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.
• Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.
The breadth of the international coalition joining the letter demonstrates the extent to which Apple’s plans open the door to threats to human rights across the globe.
19 August 2021
Tim Cook
CEO, Apple, Inc.
Dear Mr. Cook:
The undersigned organisations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products. Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.
Apple announced that it is deploying a machine learning algorithm to scan images in its text messaging service, Messages, to detect sexually explicit material sent to or from people identified as children on family accounts. This surveillance capability will be built right into Apple devices. When the algorithm detects a sexually explicit image, it warns the user that the image may be sensitive. It also sends a notice to the organiser of a family account whenever a user under age 13 chooses to send or to receive the image.
Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.
Apple also announced that it would build into the operating system of its products a hash database of CSAM images provided by the National Center for Missing and Exploited Children in the United States and other child safety organisations. It will scan against that database every photo its users upload to iCloud. When a preset threshold number of matches is met, it will disable the account and report the user and those images to authorities. Many users routinely upload the photos they take to iCloud. For these users, image surveillance is not something they can opt out of; it will be built into their iPhone or other Apple device, and into their iCloud account.
Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable. Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.
We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk, both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.
Access Now (Global)
Advocacy for Principled Action in Government (United States)
African Academic Network on Internet Policy (Africa)
American Civil Liberties Union (United States)
Aqualtune Lab (Brazil)
Asociación por los Derechos Civiles (ADC) (Argentina)
Association for Progressive Communications (APC) (Global)
Barracón Digital (Honduras)
Beyond Saving Lives Foundation (Africa)
Big Brother Watch (United Kingdom)
Body & Data (Nepal)
Canadian Civil Liberties Association
CAPÍTULO GUATEMALA DE INTERNET SOCIETY (Guatemala)
Center for Democracy & Technology (United States)
Centre for Free Expression (Canada)
CILIP/ Bürgerrechte & Polizei (Germany)
Código Sur (Central America)
Community NetHUBs Africa
Dangerous Speech Project (United States)
Defending Rights & Dissent (United States)
Demand Progress Education Fund (United States)
Derechos Digitales (Latin America)
Digital Rights Foundation (Pakistan)
Digital Rights Watch (Australia)
DNS Africa Online (Africa)
Electronic Frontier Foundation (United States)
EngageMedia (Asia-Pacific)
Eticas Foundation (Spain)
European Center for Not-for-Profit Law (ECNL) (Europe)
Fight for the Future (United States)
Free Speech Coalition Inc. (FSC) (United States)
Fundación Karisma (Colombia)
Global Forum for Media Development (GFMD) (Belgium)
Global Partners Digital (United Kingdom)
Global Voices (Netherlands)
Instituto Beta: Internet & Democracia – IBIDEM (Brazil)
Instituto de Referência em Internet e Sociedade – IRIS (Brazil)
Instituto Liberdade Digital – ILD (Brazil)
Instituto Nupef (Brazil)
Internet Governance Project, Georgia Institute of Technology (Global)
Internet Society Panama Chapter
Interpeer Project (Germany)
IP.rec – Law and Technology Research Institute of Recife (Brazil)
IPANDETEC Central America
ISOC Brazil – Brazilian Chapter of the Internet Society
ISOC Chapter Dominican Republic
ISOC India Hyderabad Chapter
ISOC Paraguay Chapter
ISOC Senegal Chapter
Kijiji Yeetu (Kenya)
LGBT Technology Partnership & Institute (United States)
Liberty (United Kingdom)
May First Movement Technology (United States)
National Coalition Against Censorship (United States)
National Working Positive Coalition (United States)
New America’s Open Technology Institute (United States)
OhmTel Ltda (Colombia)
OpenMedia (Canada/United States)
Paradigm Initiative (PIN) (Africa)
PDX Privacy (United States)
PEN America (Global)
Privacy International (Global)
PRIVACY LATAM (Argentina)
Progressive Technology Project (United States)
Prostasia Foundation (United States)
R3D: Red en Defensa de los Derechos Digitales (Mexico)
Ranking Digital Rights (United States)
S.T.O.P. – Surveillance Technology Oversight Project (United States)
Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
Sero Project (United States)
Simply Secure (United States)
Software Freedom Law Center, India
SWOP Behind Bars (United States)
Tech for Good Asia (Hong Kong)
The DKT Liberty Project (United States)
The Sex Workers Project of the Urban Justice Center (United States)
The Tor Project (Global)
US Human Rights Network (United States)
Woodhull Freedom Foundation (United States)
X-Lab (United States)
Zaina Foundation (Tanzania)
MacDailyNews Take: If you’re concerned about Apple’s recent announcement that the next version of iOS will install a mass surveillance backdoor into Apple devices, the Electronic Frontier Foundation (EFF) has created a petition to let users speak out.
The EFF petition, which we have signed and recommend that our readers sign as well, reads as follows:
Don’t Scan Our Phones
The “child safety” changes Apple plans to install on iOS 15 and macOS Monterey undermine user privacy and break the promise of end-to-end encryption.
I urge Apple to reconsider these systems. Continuous scanning of images won’t make kids safer, and may well put more of them in danger. Installing the photo-scanning software on our phones will spur governments around the world to ask for more surveillance and censorship abilities than they already have.