How Apple ‘intercepts’ child abuse images in iCloud and email while protecting privacy

Thanks to a search warrant uncovered by Forbes, we now know for the first time how Apple checks and intercepts email and files stored on iCloud when instances of illegal material – such as child abuse images – are detected.

Thomas Brewster for Forbes:

The warrant, filed in Seattle, Washington, this week, shows that despite reports of Apple being unhelpful in serious law enforcement cases, it’s being helpful in investigations.

To be clear: Apple isn’t manually checking all of your emails. It uses what most other major tech companies like Facebook or Google use to detect child abuse imagery: hashes. Think of these hashes as signatures attached to previously-identified child abuse photos and videos. When Apple systems – not staff – see one of those hashes passing through the company’s servers, a flag will go up. The email or file containing the potentially illegal images will be quarantined for further inspection.

Once the threshold has been met, that’s enough for a tech company to contact the relevant authority, typically the National Center for Missing and Exploited Children (NCMEC)… But in Apple’s case, its staff are clearly being more helpful, first stopping emails containing abuse material from being sent. A staff member then looks at the content of the files and analyzes the emails. That’s according to a search warrant in which the investigating officer published an Apple employee’s comments on how they first detected “several images of suspected child pornography” being uploaded by an iCloud user and then looked at their emails…
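To make the mechanism concrete, here is a minimal sketch of hash-based matching with a match threshold. It is purely illustrative: the hash values, threshold, and function names are assumptions rather than anything disclosed in the warrant, and production systems (for example Microsoft’s PhotoDNA, used by several large providers) rely on perceptual hashes that tolerate resizing and re-encoding, not the exact-match SHA-256 digest shown here.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: none of these names, values, or thresholds come
// from Apple or the warrant. Real CSAM-detection systems use perceptual
// hashes (e.g. PhotoDNA), not a plain cryptographic digest.

// Placeholder signatures of previously identified abuse images, as supplied
// by an authority such as NCMEC.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Assumed number of matched items before quarantine and human review kick in.
let reportingThreshold = 1

enum ScanResult {
    case allow                 // no match; the file passes through untouched
    case quarantineForReview   // match; hold the email/file for staff review
}

// Hex-encoded SHA-256 digest of a file's contents.
func sha256Hex(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Per-account scan: count matches against the known-hash set and quarantine
// once the threshold is met.
func scan(_ fileURLs: [URL]) -> ScanResult {
    var matches = 0
    for url in fileURLs {
        if let hex = try? sha256Hex(of: url), knownHashes.contains(hex) {
            matches += 1
        }
    }
    return matches >= reportingThreshold ? .quarantineForReview : .allow
}
```

A matched item would then be held for the kind of manual review the warrant describes, with confirmed material reported to NCMEC.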

MacDailyNews Take: We agree with Brewster that “as long as Apple employees are only looking into emails when abusive images are detected by its computing systems” there isn’t much of a privacy issue here. After all, “Apple, like all tech companies, has to balance privacy with safety,” Brewster writes. The only problem we foresee is that hashes could conceivably be used to detect other types of content (think financial, political, etc.) and abuse user privacy. Not that we think Apple would do that, but unscrupulous, data-hungry tech firms and government agencies certainly could, and likely already do, with unencrypted “free” email and cloud storage services.

12 Comments

  1. Privacy is always key and you have to fight to keep it or it will be whittled away bit by bit. I pissed off a union truck driver on a film set and animal porn materials showed up on my desk. It was obvious who did it but it’s too easy to entrap someone.

  2. I have no doubt that Apple has the concern and commitment to address these problems in an intelligent and comprehensive manner. They certainly have the technical talent that can be applied to addressing the problems and a willingness to work with legitimate organizations (like the National Center for Missing and Exploited Children) to add force to their efforts.

    After raising two kids and, now, enjoying three wonderful grandkids, there is no doubt in my mind that Apple’s efforts are far more important than the privacy of the slime that harm kids.

    1. Ken, I would trust you over Apple to properly parent children. Apple’s intentions may be good, but any flaws in their approach or execution could affect hundreds of millions.

      OPTIONS for a parent to get some help are great, with full disclosure. Apple can and should be working on parental software to help parents better manage their kids’ exposure to digital threats of all kinds. However, Apple’s forced scanning of all personal data in iCloud is not acceptable, period, ever. This is the first move of a Big Brother, as forecast in Apple’s own 1984 commercial.

    2. Paul, the problem isn’t children downloading porn. Parental controls can certainly help with that. The issue being discussed here is adults uploading sexual images of children. Parental controls can’t do anything about that.

      The issue isn’t just that the images will hang around the Internet forever, branding the children permanently. It is that almost all of the images were produced in the course of sexually abusing the child models. Child trafficking is largely funded by the sale of pornographic images. If you cannot control sales of this material, you can never control its production.

      In this case, Apple is only monitoring images that have already been identified as child pornography by law enforcement or a similar authority. New dirty pictures will have a different hash and will pass through unviewed and unimpeded.

      If Apple, ISPs, and similar agencies do not take at least these minimal steps, it will increase the pressure to eliminate secure encryption entirely and to require monitoring and censorship of our private communications. That would have profound consequences for whatever is left of our constitutional protections against government snooping and censorship.

      1. High and mighty words. Encryption will end for the common man and woman; only those high up will have it. Kids and young adults are being worked to death around the world now, and the corporate world is a world of no values.

