There’s never been an x.8 release of iOS before, but that’s about to change. Apple is prepping iOS 14.8 for release, and it’s coming soon.
iOS 14 is also different in that Apple has said it will allow users to stay on this release and still get patches after iOS 15 has been released.
MacDailyNews Take: As we explained last month, “Beyond a handful of test devices, we will not be upgrading to iOS 15.”
We’re currently on iOS 14.7.1, and for weeks now we’ve been hearing about an iOS 14.8. What’s weird is that there’s no beta and only speculation about what the update might contain.
A contact at Apple says that it’s likely to be a bugfix that also contains the framework to allow iPhone users to stick with iOS 14 once iOS 15 is released later this month.
But what about Apple’s child sexual abuse material (CSAM) detection system? Is Apple planning to roll this out to iPhones running iOS 14?
According to what I’ve been told, that’s unlikely at this stage. However, given that iOS 14 will carry on getting updates beyond September of this year, its inclusion in a future update can’t be ruled out.
MacDailyNews Take: Thank you, oh Great and Powerful Apple, for “allowing” us to run the last OS version that preserves user privacy on devices we own!
From now on, for those concerned with privacy, we’d recommend not upgrading to Apple OS releases until they can be vetted by third parties to be free of Photos and Messages surveillance software.
An international coalition of 90+ civil society organizations have called on Apple to abandon its plans to build backdoor #surveillance capabilities into iPhones, iPads and other #Apple products. https://t.co/BeEAuCiNEA
— MacDailyNews (@MacDailyNews) September 1, 2021
Either Apple was placed in an untenable situation that forced it to introduce this backdoor, destroying its vaunted claims of protecting privacy, or Tim Cook has completely lost the plot.
“There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever. We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust.” — Apple CEO Tim Cook, October 1, 2015
Things that make you go hmmm. When somebody spends years insisting, correctly, that water is wet and then, all of a sudden, claims that, no, water is actually dry, then clearly something is rotten in the state of Denmark.
Something caused Tim Cook to kowtow to this backdoor. What was it? – MacDailyNews, August 6, 2021
If you’re concerned about Apple’s recent announcement that the next version of iOS will install a mass surveillance backdoor into Apple devices, the Electronic Frontier Foundation (EFF) has created a petition to let users speak out.
Mass surveillance is not an acceptable crime-fighting strategy, no matter how well-intentioned the spying. If you’re upset about Apple’s recent announcement that the next version of iOS will install surveillance software in every iPhone, we need you to speak out about it.
Last year, EFF supporters spoke out and stopped the EARN IT bill, a government scheme that could have enabled the scanning of every message online. We need to harness that same energy to let Apple know that its plan to enable the scanning of photos on every iPhone is unacceptable.
Apple plans to install two scanning systems on all of its phones. One system will scan photos uploaded to iCloud and compare them to a database of child abuse images maintained by various entities, including the National Center for Missing and Exploited Children (NCMEC), a quasi-governmental agency created by Congress to help law enforcement investigate crimes against children. The other system, which operates when parents opt into it, will examine iMessages sent by minors, scanning them with an algorithm that looks for any type of “sexually explicit” material. If an explicit image is detected, the phone will notify the user and possibly the user’s parent, depending on age.
These combined systems are a danger to our privacy and security. The iPhone scanning harms privacy for all iCloud photo users, continuously scanning user photos to compare them to a secret government-created database of child abuse images. The parental notification scanner uses on-device machine learning to scan messages, then informs a third party, which breaks the promise of end-to-end encryption.
Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users.
MacDailyNews Take: The EFF petition, which we have signed and recommend that our readers sign as well, reads as follows:
Don’t Scan Our Phones
The “child safety” changes Apple plans to install on iOS 15 and macOS Monterey undermine user privacy, and break the promise of end-to-end encryption.
I urge Apple to reconsider these systems. Continuous scanning of images won’t make kids safer, and may well put more of them in danger. Installing the photo-scanning software on our phones will spur governments around the world to ask for more surveillance and censorship abilities than they already have.