Andrew Grotto explains why Apple is right to continue opposing an iPhone backdoor for law enforcement. Grotto directs the program on Geopolitics, Technology and Governance at Stanford University, and is a visiting fellow at the Hoover Institution. He served as the senior director for Cyber Policy on the National Security Council in both the Obama and Trump administrations.
Apple continues to stand its ground and refuses to re-engineer iPhones so law enforcement can unlock the devices. Apple has maintained that it has done everything required by law and that creating a “backdoor” would undermine cybersecurity and privacy for iPhone users everywhere.
Apple is right to stand firm in its position that building a “backdoor” could put user data at risk.
Law enforcement’s argument — that warrant-proof communications could allow bad actors to get away scot-free — is valid. The fact is, the more widely strong encryption is deployed, the more likely it is that communications of interest to law enforcement will simply be out of reach. But that fact alone does not settle the question, because it is also true that requiring Apple to reengineer its products to provide law enforcement with access carries its own set of risks. As Apple and others note, there is no way to engineer law enforcement access without introducing vulnerabilities that bad actors could exploit for their own criminal purposes.
There is one consideration, however, that tips the balance — at least for now. If Apple were required to reengineer its products to enable law enforcement access, it could spark a cat-and-mouse game between law enforcement and those who wish to thwart them. A cottage industry of third-party encryption applications for iPhones and other mobile devices would sprout up overnight if Apple were compelled to build backdoors.
MacDailyNews Take: Can legislators even legally force a company to design its products a certain way in the United States of America?
Regardless, simple logic dictates that an iPhone backdoor would risk the privacy and security of every single iPhone user on earth. It would also risk Apple’s sales worldwide: if Apple created backdoors to its products, sales of iPhones, iPads, Macs, etc. would drop dramatically.
What we really want to see Apple do next is to better explain the basics of this issue to the general public (a 60-second Super Bowl ad is certainly within Apple’s capabilities, for one example) while making full iCloud encryption an opt-in option for Apple product users. It should be very clearly stated that if you enable iCloud encryption and then lose or forget your password without having set up a trusted method to reset it, you will be SOL. This is why Apple has dithered so long on this, we believe (not because the FBI asked Apple to hold off on iCloud encryption; we agree with Daring Fireball’s John Gruber that something is off about that recent Reuters report).
For Apple to live up to its promises of privacy and security, Apple MUST offer full encryption of iCloud data as an option for users who understand what that entails and who wish to retain full ownership of the data stored in their iCloud backups.
There is no such thing as a secure “master key” or “backdoor.”
Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety. – Benjamin Franklin, Historical Review of Pennsylvania, 1759
Why don’t these genius politicians next attempt to legislate in purple unicorns? They’re equally as plentiful as secure backdoors. – MacDailyNews, October 3, 2018
This is not about this phone. This is about the future. And so I do see it as a precedent that should not be done in this country or in any country. This is about civil liberties and is about people’s abilities to protect themselves. If we take encryption away… the only people that would be affected are the good people, not the bad people. Apple doesn’t own encryption. Encryption is readily available in every country in the world, as a matter of fact, the U.S. government sponsors and funds encryption in many cases. And so, if we limit it in some way, the people that we’ll hurt are the good people, not the bad people; they will find it anyway. — Apple CEO Tim Cook, February 2016