The backlash over Apple’s ill-considered move to activate an iPhone backdoor, ostensibly to scan U.S. customers’ phones and computers for Child Sexual Abuse Material (CSAM), has grown to include employees speaking out internally, as well as intensified protests from leading technology policy groups.
Joseph Menn and Julia Love for Reuters:
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
MacDailyNews Take: Of course the idea of activating an iPhone backdoor has damaged and is continuing to damage Apple’s reputation for protecting privacy.
When MacDailyNews staff, after examining Apple’s scheme, go searching for “most secure smartphones” in order to replace their Apple iPhones, you’ve got issues.
Apple spent untold millions promoting “privacy” over a period of many years, only to squander it in one fell swoop.
Last week’s announcement is drawing heavier criticism from past outside supporters who say Apple is rejecting a history of well-marketed privacy fights.
People familiar with the matter said a coalition of policy groups is finalizing a letter of protest to send to Apple within days demanding a suspension of the plan. Two groups, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), both released newly detailed objections to Apple’s plan in the past 24 hours.
“What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in,” CDT project director Emma Llanso said in an interview. “It seems so out of step from everything that they had previously been saying and doing.”
Apple declined to comment for this story. It has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.
MacDailyNews Take: Bullshit.
When Apple removed VPN apps – or The New York Times app, or whatever app or service (Apple Books, for one example) the CCP demanded – from the App Store or Apple devices in China, the company told the developers of ExpressVPN, one of the apps affected, that its app was removed because “it includes content that is illegal in China.” In a subsequent statement, Apple said: “Earlier this year China’s MIIT announced that all developers offering VPNs must obtain a license from the government. We have been required to remove some VPN apps in China that do not meet the new regulations.”
Apple, which famously explains away the banning of apps in certain countries by saying that the company must comply with all local laws, might refuse requests that have no law behind them. But, based on a long history, it will not refuse demands backed by enacted laws from governments to use the system to check phones for whatever oppressive regimes come up with to monitor, categorize, and punish citizens who do not comply with official doctrine.
Apple, which, under Tim Cook, has attached itself to the China teat so firmly as to be hopelessly compromised, is lying by omission. Yes, they’ll refuse a “request,” but what will they do when it becomes law to scan for materials concerning political views, religion, sexual orientation, and the like? Apple will follow the local laws, as it has in every known case of app banning or service removal.
Apple should never introduce this system into China because Apple is not going to stand up to the CCP when the CCP could severely hamper or even shut down Apple’s business in a Beijing minute.
If the U.S., the first country where this scanning is planned, passes a law (for example, to scan for “terrorists,” using a broad definition, of course, which would allow for scanning for virtually anything in every photo on the device), history shows that Apple will follow the law.
Apple will refuse requests from governments until it becomes law. Then, as usual, Apple will follow the laws in each country where it operates.
…Apple has made other technical decisions that help authorities, including dropping a plan to encrypt widely used iCloud backups and agreeing to store Chinese user data in that country.
A fundamental problem with Apple’s new plan on scanning child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change, now that the capability is there, in exactly the same way it warned would happen if it broke into the terrorism suspect’s phone.
Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images that have been identified by the National Center for Missing and Exploited Children and a small number of other groups.
MacDailyNews Take: Whoever controls the database, or infiltrates the database, owns 1+ billion devices thanks to Apple’s iPhone backdoor.
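The point about database control follows directly from how hash matching works. Apple’s actual system uses a perceptual hash (NeuralHash) with blinded databases and threshold secret sharing; the sketch below is a deliberately simplified, hypothetical model (a cryptographic hash standing in for the perceptual one, with invented sample inputs) meant only to show that the matching logic itself is content-agnostic: the scanner flags whatever the database contains, with no notion of why an entry is there.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; hypothetical simplification.
    # Apple's real NeuralHash matches visually similar images, not exact bytes.
    return hashlib.sha256(image_bytes).hexdigest()


def scan(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Return True if the image matches any database entry.

    The scanner cannot tell a CSAM hash from any other hash --
    it only checks membership in the list it is handed.
    """
    return image_hash(image_bytes) in blocklist


# Whoever edits the blocklist decides what gets flagged:
database = {image_hash(b"known-illegal-sample")}
# A government-mandated addition is indistinguishable from a legitimate one:
database.add(image_hash(b"dissident-meme"))

assert scan(b"dissident-meme", database)      # flagged
assert not scan(b"vacation-photo", database)  # not flagged
```

In other words, every policy safeguard Apple describes lives outside this loop, in whoever curates the database — which is exactly why control of (or intrusion into) that database is the critical risk.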
Apple Inc., “A Message to Our Customers,” February 16, 2016:
We have great respect for the professionals at the FBI, and we believe their intentions are good.
[Ah, to be so naive. Ignorance is indeed bliss. But, we digress – MDN Ed.]
Apple continued:
Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Uh, yeah. Duh.
Apple continued:
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
Gee, ya think?
Apple concluded:
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Ouch.
And who, exactly, are these “small number of other groups”? Since Apple is stupidly admitting them into users’ devices to scan their photos without users’ permission, the company should not only list this mystery “small number of other groups,” it should also explain how each of their databases is perfectly secure and hack-proof, including against state actors.
Good luck with that.
But any country’s legislature or courts could demand that any one of those elements be expanded, and some of those nations, such as China, represent enormous and hard-to-refuse markets, critics said.
Police and other agencies will cite recent laws requiring “technical assistance” in investigating crimes, including in the United Kingdom and Australia, to press Apple to expand this new capability, the EFF said.
“The infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible,” wrote EFF General Counsel Kurt Opsahl.
MacDailyNews Take: We oppose Apple’s ill-considered move to activate an iPhone backdoor intended to scan customers’ devices without their permission and have signed the Open Letter Against Apple’s Privacy-Invasive Content Scanning Technology which asks that:
1. Apple Inc.’s deployment of its proposed content monitoring technology is halted immediately.
2. Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.
More info and link to the open letter against Apple’s privacy-invasive content scanning scheme here.