Apple will scan Photos and Messages for Child Sexual Abuse Material (CSAM)

Apple has released the following information via a new webpage titled “Expanded Protections for Children,” in which the company explains that it will scan Photos and Messages for Child Sexual Abuse Material (CSAM).

Apple Inc.:

Expanded Protections for Children

At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*

This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

Communication safety in Messages

The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

This feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey.*

Messages will warn children and their parents when receiving or sending sexually explicit photos.

CSAM detection

Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.

To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
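
Apple has not published source code for this system, and the real protocol is specifically designed so that neither the device’s output nor Apple can tell whether any individual image matched. Purely as a rough, non-cryptographic sketch of the per-image data flow described above (the names perceptual_hash, make_voucher, and SafetyVoucher are illustrative stand-ins, not Apple’s), the idea looks roughly like this:

```python
# Toy sketch of the on-device "safety voucher" data flow. This is NOT Apple's
# implementation: the production system uses a perceptual hash (NeuralHash) and
# a private set intersection protocol, so the match result below would never
# exist in plaintext on the device or in the upload.
import hashlib
import os
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    image_id: str
    encrypted_payload: bytes   # stand-in for the encrypted match data

def perceptual_hash(image_bytes: bytes) -> bytes:
    # Placeholder only: SHA-256 is used so the sketch runs without any ML
    # dependencies; a real perceptual hash is robust to resizing/re-encoding.
    return hashlib.sha256(image_bytes).digest()

def make_voucher(image_id: str, image_bytes: bytes,
                 blinded_known_hashes: set) -> SafetyVoucher:
    h = perceptual_hash(image_bytes)
    matched = h in blinded_known_hashes   # real PSI never exposes this bit
    # Stand-in "encryption": in the real protocol the payload can only be
    # decrypted once enough matching vouchers exist (see the threshold
    # sketch below).
    payload = bytes([int(matched)]) + os.urandom(31)
    return SafetyVoucher(image_id=image_id, encrypted_payload=payload)

if __name__ == "__main__":
    known = {perceptual_hash(b"known-bad-image-bytes")}   # hypothetical database
    voucher = make_voucher("IMG_0001", b"holiday-photo-bytes", known)
    print(voucher.image_id, len(voucher.encrypted_payload), "bytes attached")
```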

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
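
Threshold secret sharing itself is a standard cryptographic building block. Below is a minimal, self-contained sketch of the idea (Shamir’s (t, n) scheme over a toy prime field; the field size and parameters are illustrative, not Apple’s production construction): any t shares recover the secret, while fewer reveal essentially nothing.

```python
# Minimal sketch of (t, n) threshold secret sharing (Shamir's scheme).
# Toy parameters for illustration only; not Apple's production construction.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree
    threshold-1; any `threshold` points recover the constant term."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=123456789, threshold=3, num_shares=5)
    print(recover(shares[:3]))                 # any 3 shares -> 123456789
    print(recover(shares[:2]) == 123456789)    # 2 shares -> almost surely False
```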

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.

Expanding guidance in Siri and Search

Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

These updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*

Siri will provide resources and help around searches related to CSAM.

More Information

We have provided more information about these features in the documents below, including technical summaries, proofs, and independent assessments of the CSAM-detection system from cryptography and machine learning experts.

Expanded Protections for Children — Technology Summary (PDF)
CSAM Detection — Technical Summary (PDF)
Apple PSI System — Security Protocol and Analysis (PDF)
Technical Assessment of CSAM Detection — Benny Pinkas (PDF)
Technical Assessment of CSAM Detection — David Forsyth (PDF)
Technical Assessment of CSAM Detection — Mihir Bellare (PDF)
Alternative Security Proof of Apple PSI System — Mihir Bellare (PDF)

MacDailyNews Take: “These efforts will evolve and expand over time.” If you didn’t shudder at that line, you should have.

Regardless of the security and privacy protections of Apple’s system, this seems like a slippery slope. Images of the Trojan Horse also spring to mind as we quote ourselves:

Think of The Children™. Whenever you hear that line of horseshit, look for ulterior motives. — MacDailyNews, September 30, 2014

Again, as it’s likely meant to be, this sounds wonderful at first glance (detecting and rooting out purveyors of child pornography – why, everyone’s for that!) and horrible once you think about it for more than a second (massive, awful potential for misuse).

It’s a huge can of worms. And it will do much to negate Apple’s voluminous claims of protecting users’ privacy, regardless of the privacy protections in place.

All of the marketing money, TV ads, and solemn privacy pronouncements from Tim Cook over the last several years can’t stand up to the simple fact that now, in a nutshell, Apple will scan Photos and Messages.

It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything.

Smart people will begin looking for alternatives to Apple’s iCloud Photos and Photos and Messages apps.

To turn off your iCloud Photos everywhere, follow these steps:

• On your iPhone, iPad, or iPod touch, go to Settings > [your name] > iCloud > Manage Storage > Photos, then tap Disable & Delete.

• On your Mac, go to Apple menu > System Preferences, then click Apple ID. Choose iCloud, then click Manage. Select Photos, then click Turn Off and Delete.

If you change your mind, follow the steps above on your device then select Undo Delete.

Photos and videos are stored in your account for 30 days. To download your photos and videos on your iOS device, go to Settings > [your name] > iCloud > Photos and select Download and Keep Originals. On your Mac, open Photos, choose Photos > Preferences, then click iCloud, then select Download Originals to this Mac. You can also select the photos and videos that you want to download from iCloud.com.

We hope Apple’s scanning system is never used for anything besides CSAM, but given the pressure that governments can wield and the extent to which Apple is beholden to China, we highly doubt it.

See also: How to jailbreak your iPhone

33 Comments

  1. Tim Cook is either a fool or evil.

    I think it’s the former, but I haven’t fully ruled out the latter.

    It starts with The Children™.

    It ends with:

    Freedom is never more than one generation away from extinction. We didn’t pass it to our children in the bloodstream. It must be fought for, protected, and handed on for them to do the same, or one day we will spend our sunset years telling our children and our children’s children what it was once like in the United States where men were free. – Ronald Reagan, March 30, 1961

    1. I get that you’re being facetious, but others might not, so one quote for them:

      “Again, as it’s likely meant to be, this sounds wonderful at first glance (detecting and rooting out purveyors of child pornography – why, everyone’s for that!) and horrible once you think about it for more than a second (massive, awful potential for misuse).” – MacDailyNews (above)

      1. Mad Quoter, you provide an important service. You will be very busy.

        MDN said the same thing in the other post, “Apple said to add iOS photo identification feature to detect child pornography”.

        It didn’t work. Those of us who questioned or criticized this move by Apple were accused of being child molesters by TXUser. It was his response to the thread. Go have a look. A fine piece of work.

        1. I’m sorry if the snowflakes who regularly call me a braindead communist piece of lying excrement got offended at receiving some of their own medicine. My more reasoned response was:

          Untwist your knickers! The terms of service for iCloud have stated since 2019 that material stored with Apple is subject to screening for child pornography. The company does that itself, and it responds to lawful subpoenas from law enforcement. This is just moving the screening from the server side to the device. The company is scanning far fewer of your pictures. I would think that is less intrusive, not more so.

          All that is being done is to check that the images are not substantially identical to ones that have already been identified as child porn. So, unless your baby bath pictures have already appeared on a pedophile web site, you should be OK.

          This is not a new thing. Long before we had electronic imaging, the folks who developed your film scanned it for pornography and called the police if they saw any. Google has been checking everything going through its servers, including every Gmail message, since at least 2014.

          Companies like Apple and Google do not really have a choice if they wish to protect themselves from being charged with the possession (and arguably distribution) of child pornography. The Communications Decency Act requires them to take reasonable measures to identify and remove improper content. (I know this annoys some of you, but that requirement prevents the companies from claiming to be neutral digital platforms rather than publishers exercising discretion. That would subject them to absolute liability for third-party posts absent Section 230 of the Act.)

          The technology is scary, and it may represent the top of a slippery slope, but it does not provide Apple customers with any less privacy than they have now.

          The irony is that the people screaming here are, by and large, the same ones who answer any criticism of the Arizona “audit” and its threat to the secrecy of the ballot with a loud, “They wouldn’t object if they didn’t have something to hide.”

        2. The difference between data saved on your phone and the Arizona audit is clear: one is your stuff, the other is our stuff.

          And yes, they wouldn’t object if they didn’t have something to hide. I mean, the 2020 elections were the most secure and transparent ever, right?

          I’m sure there is nothing to hide. So, why object? Is counting and checking ballots hurting anyone?

        3. And I’m sure you don’t have anything to hide, so how is letting Apple compare the images you voluntarily chose to store on their server with a database of previously identified child pornography hurting anyone?

        4. The public asks for a count of THEIR votes. Do those votes belong to the government? Are the public’s votes private property?

          A little different situation than data saved on Apple’s servers.

        5. If it is stored on Apple’s servers, it may be your property, but it is in their possession. The law is called “possession of child pornography” for a reason. Ownership is irrelevant.

          If you have family sharing or otherwise allow somebody else access to those pictures, it is Apple that will be downloading the images to someone who does not already have them. Ever hear of “distribution of child pornography?”

          In the old days, digital platforms could claim to be effectively common carriers that stored and distributed images and other data for everybody who came along, so they weren’t responsible like a publisher for third-party material that was pornographic, libelous, or infringed intellectual property rights. The Communications Decency Act and the Digital Millennium Copyright Act changed the situation decades ago. The platforms now have a duty to control illegal material in their possession.

          If you don’t like that, then get Congress to change the law or move to another country. Don’t blame Apple.

        6. “Oh the irony!” Indeed. Also…
          Vaccination Cards Cost $0.00 “Bad” – because Communism apparently…but
          Trump Cards Cost min $50.00 “Good” –
          because Trump Reich needs your loyalty fealty payments.

          “Protect the Unborn” is pro ‘Life’ but –
          Protect children from predatory pedophiles and sexual abuse is an anti freedom slippery slope!
          Faux outrage is a Trump legacy for fools and sycophants.

  2. If this goes through, it will be the end of the line with Apple for me. It’s been a great ride – starting with an all-in-one Performa 5200, iirc – but the increasing encroachment by big tech, big gov, etc. on what I see as my property and personal living sphere is getting to be too much and out of (my) control.

    How is it Apple’s place to unilaterally decide to digitally invade my devices and my personal data? Without my consent, even? The motivation is neither here nor there. Freedoms are always taken away with “think of the children,” “but those terrorists,” and “the public good,” and it always results in authorities growing in power. You never get them back without fighting for them.

    And of course it is a slippery slope; it always happens in slow and seemingly small steps. But step back far enough and the direction is clear. Goalposts and thresholds move all the time to favor the powers that be at any particular moment in time. Panta rhei. That is why strict adherence to fundamental principles must be applied at all times, to help guard against the confusion and fear that can overwhelm oneself.

    Freedom that is not absolute is not freedom at all.

    No regrets about being with Apple for about 25 years or so; it served me very well. But my growing skepticism toward Apple over the last few years on all sorts of stuff has already lowered my enthusiasm considerably, and I stopped being a Mac evangelist in my personal sphere several years ago.

    So thanks, Apple, for the good years, but Linux will be my next bus stop. I’ll start building a fanless PC next year or so. Even at almost 60, I like a challenge to keep mind and spirit fresh.

    1. I feel ya, Panda. Back in the darkest days I converted a dozen people from PC to Mac. Used to put the Apple decal on my cars.

      Haven’t done that in about 9 years.

      Maybe there will be a resurgence of cottage industry homemade computers, this time based on Linux rather than DOS?

  3. Beyond all the very valid concerns stated above… wouldn’t a direct official announcement of this sort from Apple pre-warn the child abusers, making them flock to Android or other devices?

  4. Untwist your knickers! The terms of service for iCloud have stated since 2019 that material stored with Apple is subject to screening for child pornography. The company does that itself, and it responds to lawful subpoenas from law enforcement. This is just moving the screening from the server side to the device. The company is scanning far fewer of your pictures. I would think that is less intrusive, not more so.

    All that is being done is to check that the images are not substantially identical to ones that have already been identified as child porn. So, unless your baby bath pictures have already appeared on a pedophile web site, you should be OK.

    This is not a new thing. Long before we had electronic imaging, the folks who developed your film scanned it for pornography and called the police if they saw any. Google has been checking everything going through its servers, including every Gmail message, since at least 2014.

    Companies like Apple and Google do not really have a choice if they wish to protect themselves from being charged with the possession (and arguably distribution) of child pornography. The Communications Decency Act requires them to take reasonable measures to identify and remove improper content. (I know this annoys some of you, but that requirement prevents the companies from claiming to be neutral digital platforms rather than publishers exercising discretion. That would subject them to absolute liability for third-party posts absent Section 230 of the Act.)

    The technology is scary, and it may represent the top of a slippery slope, but it does not provide Apple customers with any less privacy than they have now.

    The irony is that the people screaming here are, by and large, the same ones who answer any criticism of the Arizona “audit” and its threat to the secrecy of the ballot with a loud, “They wouldn’t object if they didn’t have something to hide.”

    1. “The irony is that the people screaming here are, by and large, the same ones who answer any criticism of the Arizona “audit” and its threat to the secrecy of the ballot with a loud, ‘They wouldn’t object if they didn’t have something to hide.'”

      Why are you equating a stolen election with child pornography? You must be one sick dude!

    2. The solution is not that difficult to comprehend. It worked for over 200 years. It’s called: get a search warrant. No searches without a warrant. Laws don’t stop just because a private company is involved. Neither do God-given rights.

      If the police can show probable cause, they can go to a judge and get a warrant.

      1. Worked for 200 years … until the law changed in 1996 as TxUser pointed out.

        Regarding the Twitter concerns of Mr. Green: if he doesn’t like the implications, then he should work to rescind the 1996 Communications Decency Act.

        Setting aside politics to talk tech…

        One classical image-matching approach was to use Laplace transforms. The concerns expressed about hashes being imprecise seem to miss that this simplest technical approach works at the pixel level and returns only perfect matches.

        This merely illustrates how hard it is to do as well as an attentive human, since the machine must also match imperfect digital copies, particularly those that were deliberately altered in an attempt to fool it.

        Anyone who has monitored reverse image searches over the past decade will recall when something as trivial as cropping less than 2% of an image (e.g., to remove a copyright notice) was enough to fool the algorithms into believing it wasn’t a match. Some of the free online services can still be fooled in this fashion.
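
        For a concrete sense of how fuzzy matching differs from exact hashing, here is a minimal average-hash (“aHash”) sketch in Python (assuming Pillow is installed; photo.jpg is a hypothetical input file). It is far simpler than any production perceptual hash, but it shows why a small crop that completely changes a cryptographic hash typically shifts only a few bits of a perceptual one:

        ```python
        # Minimal average-hash ("aHash") sketch: downscale to 8x8 grayscale and
        # set one bit per pixel based on whether it exceeds the mean brightness.
        # A small crop changes a cryptographic hash completely, but usually
        # moves a perceptual hash by only a few bits (small Hamming distance).
        from PIL import Image

        def average_hash(img, hash_size=8):
            small = img.convert("L").resize((hash_size, hash_size))
            pixels = list(small.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def hamming_distance(a, b):
            return bin(a ^ b).count("1")

        if __name__ == "__main__":
            original = Image.open("photo.jpg")                  # hypothetical input
            w, h = original.size
            cropped = original.crop((0, 0, int(w * 0.98), h))   # trim ~2% off one edge
            d = hamming_distance(average_hash(original), average_hash(cropped))
            print(f"perceptual hash distance: {d} of 64 bits")
        ```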

  5. This is stunning and stunningly bad. Apple’s privacy “commitment” was the one anchor that kept me loyal to the company through several glitchy products and questionable decisions. Now, not. I have already disabled auto-updates on my phone and iPad and will do so immediately on the wife’s. I’ll warn my grown kids to do the same. No more Apple purchases for personal or business, not even the 32″ M-powered iMac I have been waiting for. George Orwell was, sadly, right.

  6. Marxists always start with an issue that can’t be argued against – only the evil are for child porn and foolish children will regret sharing nude selfies.

    But… a Billionaire Marxist surely would have no evil intent.

    Sure, he personally banned apps that went against what he calls his values, because He knows better. Certainly those personal values aren’t shaped by his sexual orientation choice or political affiliation… right?

    Riiiiiiiiiiiiight….

  7. It is not Apple’s role to warn my children about what they are looking at/for on the internet. That’s my job.

    It’s not Apple’s job to go through my photos for any reason.

    Apple makes devices and sells them; that’s where its role should end.

  8. The police cannot legally invade your home and wander around searching your computer and sock drawer. They usually need a warrant. The Apple search tactics now put into place make Apple a police agent, and so it has to apply the same constitutional protections that the police have to follow. I am really curious how the courts will look at this.
    Yes, child porn is a horror. But also yes: China, anyone in the Middle East, any authoritarian government, the US, will be able to use client-side software now.
    This has to be taken to court the minute it is implemented.
    Sure would make my decisions as to buying Apple products easier.

    1. So you are suggesting that if somebody witnesses a crime and calls the police, that makes them a government actor who must comply with the Fourth and Fifth Amendments? Sorry, but it does not work that way. Any citizen can report a crime he witnesses and, in some cases (child abuse being one of them), reporting is mandatory.

      For example, you invite a neighbor to come into your house to feed your cat while you are away and they see a pile of heroin packets on your kitchen table next to the cat bowl. Obviously, they can report that to the police. They do not need a warrant, although the police will need a warrant (based on the neighbor’s call as probable cause) to conduct their own search.

      How is it different if you invite Apple to keep your pictures for you after they tell you that they will be checking the pictures for child pornography so that the company can protect itself from criminal charges? I am not sure why you think it is better for them to look at ALL your images to detect child pornography, rather than looking only at those images that have been flagged by your phone as suspicious. Either way, they are going to be looking for illegal images and reporting them to law enforcement exactly as the law requires.

      The technology for conducting client-side searches has existed for years, if not decades. Any halfway competent coder could implement it. A totalitarian government is no more likely to require Apple to install it now than they were ten years ago. If they do that, Apple will be free to refuse and stop operating in that country. We have every reason to believe that they would do that before they wrote spyware.

      Why not wait until that happens, rather than threatening to put Apple out of business for something that is no more of a reality today than it was five years ago?

  9. Apple has one job: build insanely great products to empower its customers to change the (their) world. If Apple wants to get into the morality business, then let them hire priests or rabbis. This techno-authoritarianism is dangerous.

  10. I don’t mind parents being given the capability to monitor their children but Apple should not be in the loop at all. This is the end of Apple device privacy as we have known it. Since many people bought Apple devices based, in part or in whole, on privacy expectations, I see this ending up in court. May have to completely rethink our relationship with Apple now.

  11. I don’t see the problem. Of course we can trust our government. It’s not like there are a bunch of political prisoners being held in the Washington DC jails. Oh, wait…..
