Apple is scanning every photo uploaded to iCloud to check for child exploitation imagery.
Going forward, any images backed up to the company’s online storage service, iCloud, will be screened and checked for illegal content.
“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled,” according to Jane Horvath, Apple’s chief privacy officer, [at CES 2020].
MacDailyNews Take: By the sound of it (the comparison to email spam filters), we believe Apple is checking hashes of previously identified images of child exploitation against images stored on its iCloud service, not actually analyzing the content of individual images. We’re fine with the former method, and not with the latter (which we assume Apple isn’t either, given their strong stance on privacy).
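The distinction drawn in our Take above — matching uploads against a database of signatures of previously identified images, rather than analyzing photo content — can be sketched roughly as follows. This is a minimal illustration, not Apple’s actual implementation: the function names and the signature database are hypothetical, and real-world systems (such as Microsoft’s PhotoDNA) use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses a plain cryptographic hash only to keep the idea simple.

```python
import hashlib

# Hypothetical database of "electronic signatures" of previously
# identified illegal images. In a real deployment these would be
# perceptual hashes supplied by organizations like NCMEC, not
# SHA-256 digests computed here for illustration.
KNOWN_SIGNATURES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def signature(image_bytes: bytes) -> str:
    """Derive a signature for an uploaded image (sketch only)."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known signature.

    Note the key privacy property: the system only answers
    "does this match a known image?" — it never interprets
    or classifies the photo's actual content.
    """
    return signature(image_bytes) in KNOWN_SIGNATURES

print(flag_upload(b"known-bad-image-bytes"))   # matches the database
print(flag_upload(b"ordinary-holiday-photo"))  # no match
```

The point of the sketch is that only matches against a pre-built list are ever flagged; an ordinary photo that appears in no database produces no signal at all.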