Apple is now scanning your iCloud photos to check for child exploitation

Apple is scanning every photo uploaded to iCloud to check for child exploitation imagery.

Natividad Sidlangan for Apple Gazette:

From this point on, any images backed up to the company’s online storage service, iCloud, will be screened and checked for illegal content.

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled,” according to Jane Horvath, Apple’s chief privacy officer, [at CES 2020].

MacDailyNews Take: By the sound of it (the comparison to email spam filters), we believe Apple is checking images stored on its iCloud service against previously identified images of child exploitation, not actually analyzing the content of individual images. We’re fine with the former method, and not with the latter (which we assume Apple isn’t either, given their strong stance on privacy).
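
If that reading is right, the mechanism Horvath describes amounts to signature matching against a database of previously identified images, not analysis of what a photo depicts. The sketch below is a minimal illustration of that idea only, not Apple’s implementation: the KNOWN_SIGNATURES set and both function names are hypothetical, and a plain cryptographic hash stands in for the perceptual hashes (PhotoDNA-style) that real matching systems rely on so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical signature database for previously identified images.
# Real systems use perceptual hashes that survive resizing and
# re-encoding; SHA-256 only matches byte-identical files and is used
# here purely for brevity.
KNOWN_SIGNATURES = {
    "placeholder-signature-1",
    "placeholder-signature-2",
}

def signature(image_bytes: bytes) -> str:
    """Derive a signature from an uploaded image's bytes (SHA-256 stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True only if the upload's signature is already in the database.

    Nothing here analyzes what the picture depicts; the check is a lookup
    of a derived signature against signatures of previously catalogued
    images, which is the distinction drawn in the Take above.
    """
    return signature(image_bytes) in KNOWN_SIGNATURES
```

In this model an ordinary family photo yields a signature that matches nothing in the database and is never flagged; only files whose signatures correspond to previously catalogued material would be surfaced, and a flagged account could then be disabled under the terms of service quoted above.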

17 Comments

  1. Uploading any personal info of yours (files, pictures, videos, etc.) onto a computer that isn’t yours is stupid. Use those hardwired cables and those big hard/flash drives; Apple, Facebook, Amazon, and Google are not your friends.

  2. Apple: We’ve got your back. Everything is private and secure. Your privacy is our specialty. That is, until we suspect you of doing something that we disapprove of. Then we’ll strip away any rights we’ve given you. Your fate is in our hands and so you must conform to our definition of morality.

    Let the snooping begin. Or is it rather: let the snooping begin to be made public?

  3. If another kind of illegal act is observed along the way and is not reported, doesn’t that itself have legal ramifications for Apple… some sort of complicity? I know filters are “trained” to detect a certain look, but closer human inspection and discovery will always follow.

    Though a person who’s involved with any form of child abuse should be roasted, this “snooping” is disconcerting, as it opens wide another party having “ownership/possession” of the truths of a person’s life. It wasn’t that long ago it was revealed Apple had workers listening to Siri recordings for “training,” though they let that crew go after the privacy brouhaha. Is announcing this search what saves Apple from breaking their hero-like “we’ve got your privacy back” motto? Either way, the hero loses some of their clout for me upon hearing this news.

    I have nothing to hide, but I’m not interested in opening myself up to something/someone that could someday be used against me under whatever law or social more might emerge in the future. A parallel: who would have ever thought that ancestry.com and other “find your people” sites would become a security risk?

    “Apple is dedicated to protecting children throughout our ecosystem…”

    Substitute “children” with “animals,” or any other reasonable word in the sentence, and I have to ask… why only children, why stop here, what’s next?

    1. “who would have ever thought that ancestry.com and other “find your people” sites would have ever become a security risk?”
      Anyone with half a brain cell who raises an eyebrow when reading the statement, that’s who.

      Giving your detailed personal information to strangers is a fun and entertaining way to pass the time! Even better when you are paying them for the privilege of taking your information!

  4. I own a Mac Pro trashcan and 2 iPad Pros, and I buy at least 2 movies a month from iTunes.
    I have zero use for iCloud; I disabled it on my PCs. I never stream a movie. I buy, download, and rip.

    Apple, do you have a warrant? Because I certainly didn’t appoint you as my IT manager.

  5. I must be in the minority here because I don’t have images of naked children in my library and have no desire to have images of naked children in my library. Just seems like more of a liberal thing, but I reckon some of the folks here just LOOOOVE themselves some images of naked kids. Or at least, as one poster here indicated, find pictures of naked children “cute”.

  6. Good cause today. Tomorrow it will be the cause du jour. Given Cook’s inability to separate his socio-political causes from Apple, I fully expect this to be misused before long.

    I’m out. I just ordered a NAS box. I’ll set up my own cloud.

    1. And, this is how things are SUPPOSED to work. People vote with their dollars. There are some people that either have pictures of exploited or nude children or expect to someday acquire pictures of exploited or nude children and they should send a clear signal to Apple that they are NOT in favor of the nanny state attempting to protect exploited children!

      Those few people left that don’t have photos of naked children in their Photos libraries (like me) will NOT be enough to support Apple in the long run, and they will likely change their tune!

        1. That’s already in place, and I trust you’ve made your contribution to the Trump re-election campaign? Unless you’d prefer to have a non-Republican President.
