Apple has long made privacy central to its marketing pitch for iPhones and iPads, but with the looming installation of a backdoor that ostensibly scans for Child Sexual Abuse Material (CSAM), Apple's iPhones and iPads will no longer protect user privacy as robustly as the company once claimed.
Apple unveiled a plan two weeks ago founded in good intentions: Root out images of child sexual abuse from iPhones.
But as is often the case when changes are made to digital privacy and security, technology experts quickly identified the downside: Apple's approach to scanning people's private photos could give law enforcement authorities and governments a new way to surveil citizens and persecute dissidents. Once one chink in privacy armor is identified, anyone can attack it, they argued.
“Once you create that back door, it will be used by people whom you don’t want to use it,” said Eva Galperin, the cybersecurity director at the Electronic Frontier Foundation, a digital-rights group. “That is not a theoretical harm. That is a harm we’ve seen happen time and time again.”
Apple was not expecting such backlash.
MacDailyNews Take: Really, after the quote below, Apple wasn’t expecting such backlash? Why the hell not?
Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control…
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them…
The implications of the government's demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge. — Tim Cook, "A Message to Our Customers," February 2016
Privacy means people know what they’re signing up for, in plain English and repeatedly. I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data. — Steve Jobs
Cybersecurity experts, the head of the messaging app WhatsApp and Edward J. Snowden, the former intelligence contractor who leaked classified documents about government surveillance, all denounced the move as setting a dangerous precedent that could enable governments to look into people’s private phones. Apple scheduled four more press briefings to combat what it said were misunderstandings, admitted it had bungled its messaging and announced new safeguards meant to address some concerns. More than 8,000 people responded with an open letter calling on Apple to halt its moves.
As of now, Apple has said it is going forward with the plans. But the company is in a precarious position. It has for years worked to make iPhones more secure, and in turn, it has made privacy central to its marketing pitch…
[Apple’s new software] scans users’ iPhone [and/or iPad] photos and compares them against a database of known child sexual abuse images.
To prevent false positives and hide the images of abuse, Apple took a complex approach. Its software reduces each photo to a unique set of numbers — a sort of image fingerprint called a hash — and then runs them against hashes of known images of child abuse provided by groups like the National Center for Missing and Exploited Children.
If 30 or more of a user’s photos appear to match the abuse images, an Apple employee reviews the matches. If any of the photos show child sexual abuse, Apple sends them to the authorities and locks the user’s account. Apple said it would turn on the feature in the United States over the next several months…
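The matching logic described above can be sketched in a few lines. This is a loose illustration only: Apple's actual system uses "NeuralHash," a perceptual hash computed on-device that tolerates resizing and recompression, plus cryptographic blinding of the database; the cryptographic hash below is a stand-in, so in this sketch only exact byte-for-byte copies would match. The threshold of 30 comes from the article; all function names are hypothetical.

```python
# Minimal sketch, assuming a hash-plus-threshold design as described above.
# SHA-256 stands in for Apple's perceptual NeuralHash, so this toy version
# only matches exact copies of an image, not re-encoded or resized ones.
import hashlib

MATCH_THRESHOLD = 30  # matches required before human review, per the article


def image_hash(image_bytes: bytes) -> str:
    """Stand-in image fingerprint (a real perceptual hash is fuzzier)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(user_photos: list[bytes], known_hashes: set[str]) -> int:
    """Count how many of the user's photos match the known-image database."""
    return sum(1 for photo in user_photos if image_hash(photo) in known_hashes)


def flag_for_review(user_photos: list[bytes], known_hashes: set[str]) -> bool:
    """Flag an account only once the match count reaches the threshold."""
    return count_matches(user_photos, known_hashes) >= MATCH_THRESHOLD
```

The threshold exists to keep the false-positive rate low: a single accidental hash collision never triggers review; only an accumulation of matches does.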
Computer scientists, as well as privacy groups and civil-liberty lawyers, immediately condemned the approach… Other tech companies, like Facebook, Google and Microsoft, also scan users’ photos to look for child sexual abuse, but they do so only on images that are on the companies’ computer servers. In Apple’s case, much of the scanning happens directly on people’s iPhones. (Apple said it would scan photos that users had chosen to upload to its iCloud storage service, but scanning still happens on the phone.)
To many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can look at a person’s private data and report it to law enforcement authorities. Privacy groups and security experts are worried that governments looking for criminals, opponents or other targets could find plenty of ways to use such a system.
If governments had previously asked Apple to analyze people’s photos, the company could have responded that it couldn’t. Now that it has built a system that can, Apple must argue that it won’t.
In response, Apple has assured the public that it will not accede to such requests. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company said in a statement.
MacDailyNews Take: Bullshit.
When Apple removed VPN apps – or The New York Times app, or whatever app or service (Apple Books, for one example) the CCP demanded – from the App Store or Apple devices in China, the company told the developers of ExpressVPN, one of the apps affected, that its app was removed because “it includes content that is illegal in China.” In a subsequent statement, Apple said: “Earlier this year China’s MIIT announced that all developers offering VPNs must obtain a license from the government. We have been required to remove some VPN apps in China that do not meet the new regulations.”
Apple, which famously explains away the banning of apps in certain countries by saying that the company must comply with all local laws, might refuse requests that lack accompanying laws. But, based on a long history, it will not refuse demands backed by passed laws to use the system to check phones for whatever oppressive governments dream up to monitor, categorize, and punish citizens who do not comply with official doctrine.
Apple, which, under Tim Cook, has attached itself to the China teat so firmly as to be hopelessly compromised, is lying by omission. Yes, they’ll refuse a “request,” but what will they do when it becomes law to scan for political, religious, sexual orientation, etc. materials? Apple will follow the local laws, as they have in every known case of app banning/service removal.
Apple should never introduce this system into China, because Apple is not going to stand up to the CCP when the CCP could severely hamper or even shut down Apple's business in a Beijing minute.
If the U.S., the first country where this scanning is planned, passes a law (for example, to scan for “terrorists,” using a broad definition, of course, which would allow for scanning for virtually anything in every photo on the device), history shows that Apple will follow the law.
Apple will refuse requests from governments until it becomes law. Then, as usual, Apple will follow the laws in each country where it operates.
For years, technologists have argued that giving the police a way into phones would fundamentally undermine the devices’ security, but now governments can point to Apple’s endorsement of its photo-scanning tools as a method that helps the police while preserving privacy.
MacDailyNews Take: So, if Apple no longer protects user privacy, why buy an iPhone or an iPad?
Well, build quality would be among the reasons that spring to mind – although Samsung seems to have figured out how to offer in-display fingerprint ID, pinhole cameras instead of giant, unsightly, content-occluding notches, and advanced foldable displays, while images produced by Google's Pixel phone cameras are often rated higher than those from Apple's best iPhones. So that argument isn't as cut-and-dried as it once was.
Apple’s ecosystem, even if it will no longer protect users’ privacy to the same degree as Apple previously claimed and promoted, is still much more unified across Apple devices with Continuity, so users of multiple Apple devices can certainly benefit. Apple Watch users, of course, will need to stay with iPhone as well.
Currently, iPhones and iPads hold their value much better than competing devices. iPhones and iPads also, in general, get the best apps first; even today, many Android apps are lesser ports of iOS/iPadOS apps.
Apple’s famous ease of use is an advantage over Android phones and tablets, although Google narrows that gulf with each new Android release.
Another consideration is Apple Pay. If you use Apple Pay and want to continue using it, you’ll need an iPhone. Of course, many alternatives, like Google Pay, exist for Android phones, tablets, and watches.
Finally, Apple’s chips are faster and Apple’s control of the whole widget makes for generally faster, smoother, more efficient devices.
So, even if Apple no longer protects user privacy to the extent they once claimed, there are still plenty of reasons to buy an iPhone or an iPad over competing choices.