Apple is said to have dumped its ill-conceived plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM, amid a renewed privacy push.
Bradford Betz for Fox Business:
These safety tools, announced in August 2021, were meant to flag illicit content while preserving privacy. But the plans drew widespread criticism from digital rights groups who argued that the surveillance capabilities were ripe for potential abuse.
Apple put the plans on pause a month later. Now, more than a year after its announcement, the company has no plans to move forward with the CSAM-detection tool…
Apple says the best way to prevent online exploitation of children is to interrupt it before it happens. The company pointed to new features it rolled out in December 2021 that enabled this process.
The company announced Wednesday it will now offer full end-to-end encryption for nearly all the data its users store in its global cloud-based storage system, making it more difficult for hackers, spies and law enforcement agencies to access sensitive user information.
MacDailyNews Take: if true, Apple is to be commended for waking up, dumping its disastrous scheme to scan users’ photos without their permission (to which we were vehemently opposed from day one), and for finally enabling end-to-end encryption to iCloud Backup, Photos, Notes, and more!
See also:
• Apple will scan Photos and Messages for Child Sexual Abuse Materials (CSAM) – August 5, 2021
• If Apple no longer protects user privacy, why buy an iPhone or an iPad? – August 19, 2021
• Apple brings end-to-end encryption to iCloud Backup, Photos, Notes, and more – December 7, 2022

As Apple ramps up user security the FBI and other Gubbermint bad actors will scream. Too bad !!!! Their collective misconduct and ILLEGAL activities demand they be excluded as the miscreants they have become……..
oh sorry… too soon…???
but but but we would never abuse our AuThORiTy………..!!!
BS
Yessir! Nailed it!
If this is indeed true, then I can finally upgrade from macOS 11.7 and iOS 14.8.1.
I have been waiting 🙂
Extremely unwise and disruptive for companies to take the role of MORALITY cops. Morality cops are driven by their own hidden salacious perversions (in this case, scanning photographs of children).
Sure. After the Twitter debacle. Brace yourselves for every tech company to start covering their a***s in a similar fashion, because they were all doing, or were at least prepared to do, the same go******d thing as Twitter. This should have never been a conversation. The object lesson has been taught, and these companies were all cowards when it mattered. Guess we should be grateful for small favors, even from Apple, however they come about (I highly doubt that convo between Musk and Cook was just about the App Store).