Apple is said to have dumped its ill-conceived plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM…
Tag: child pornography
After Apple installs iPhone backdoor, Android owners less likely to switch, says survey
After Apple inexplicably threw away iPhone’s privacy advantage by installing a backdoor into iPhone, fewer Android users are considering…
If Apple no longer protects user privacy, why buy an iPhone or an iPad?
Apple has long made privacy central to its marketing pitch for iPhones and iPads, but with the looming installation of a backdoor to scan…
Nearly 100 civil rights groups worldwide call on Apple to abandon CSAM iPhone surveillance
Nearly 100 civil rights groups worldwide have signed an open letter objecting to Apple’s plan to institute iPhone surveillance capabilities…
EFF petition lets users speak out against Apple’s mass surveillance plans
If you’re concerned about Apple’s recent announcement that the next iOS will install a mass surveillance backdoor into every iPhone…
Apple warns retail and online employees to be ready for iPhone backdoor questions
Apple has warned retail and online sales staff to be ready to field questions from consumers about the activation of an iPhone backdoor…
Apple’s ill-considered iPhone backdoor has employees speaking out internally
The backlash over Apple’s ill-considered move to activate an iPhone backdoor ostensibly to scan U.S. customers’ phones and computers…
Open letter against Apple’s privacy-invasive content scanning tech gathers thousands of signatures
An open letter dated August 6, 2021 against Apple’s privacy-invasive content scanning technology, in which security and privacy experts…
Privacy, Schmivacy. Apple installs backdoors to iPhones
Apple has announced its plans to bring changes to its operating systems that sound like a massive privacy nightmare to security researchers…
Center for Democracy & Technology: Apple’s backdoor scanning of Photos and Messages threatens users’ security and privacy
The reception to Apple’s backdoor scanning of Photos and Messages is clearly not the welcoming one for which Apple execs may have naively…
EFF: Apple’s photos scanning backdoor is a shocking privacy and security about-face
The EFF says: “Apple is planning to build a backdoor into its data storage system and its messaging system.”
Apple will scan Photos and Messages for Child Sexual Abuse Materials (CSAM)
“Expanded Protections for Children” in which Apple explains it will scan Photos and Messages for Child Sexual Abuse Materials (CSAM)…
Apple said to add iOS photo identification feature to detect child pornography
Apple is close to announcing a new photo identification feature that will detect photos, such as child pornography, in users’ photo libraries…
Telegram iOS app removed from App Store last week due to child pornography
We now have the answer for why the popular messaging app Telegram was pulled from the App Store last week…
Employees sue Microsoft over PTSD after company required them to watch child porn and murder
Two Microsoft employees who had to watch ‘horrific images’ of murder, child pornography and bestiality…