Why Apple is right to oppose an iPhone backdoor for law enforcement

Andrew Grotto explains why Apple is right to continue opposing an iPhone backdoor for law enforcement. Grotto directs the program on Geopolitics, Technology and Governance at Stanford University, and is a visiting fellow at the Hoover Institution. He served as the senior director for Cyber Policy on the National Security Council in both the Obama and Trump administrations.

Andrew Grotto for CNN Business Perspectives:

iPhone passcode lock screen
Apple continues to stand its ground and refuses to re-engineer iPhones so law enforcement can unlock the devices. Apple has maintained that it has done everything required by law and that creating a “backdoor” would undermine cybersecurity and privacy for iPhone users everywhere.

Apple is right to stand firm in its position that building a “backdoor” could put user data at risk.

Law enforcement’s argument that warrant-proof communications could allow bad actors to get away scot-free is valid. The fact is, the more widely deployed strong encryption is, the more likely it is that communications of interest to law enforcement will simply be out of their reach. But that fact alone does not settle it, because it is also true that requiring Apple to reengineer its products to provide law enforcement with access carries its own set of risks. As Apple and others note, there is no way to engineer law enforcement access without introducing risks that bad actors could exploit for their own criminal purposes.

There is one consideration, however, that tips the balance — at least for now. If Apple were required to reengineer its products to enable law enforcement access, it could spark a cat-and-mouse game between law enforcement and those who wish to thwart them. A cottage industry of third-party encryption applications for iPhones and other mobile devices would sprout up overnight if Apple were compelled to build backdoors.

MacDailyNews Take: Can legislators even legally force a company to design its products a certain way in the United States of America?

Regardless, simple logic dictates that an iPhone backdoor would risk the privacy and security of every single iPhone user on earth. It would also risk Apple’s sales worldwide. If Apple created backdoors into its products, sales of iPhones, iPads, Macs, etc. would drop dramatically.

What we really want to see Apple do next is to better explain the basics of this issue to the general public (a 60-second ad in the Super Bowl is certainly within Apple’s capabilities, for one example) while making full iCloud encryption an opt-in option for Apple product users. It should be very clearly stated that if you enable iCloud encryption, then lose or forget your password and have failed to set up a trusted method to reset it, you will be SOL. This is why Apple has dithered so long on this, we believe (not because the FBI asked Apple to hold off on iCloud encryption; we agree with Daring Fireball’s John Gruber that something is off about that recent Reuters report).

For Apple to live up to its promises of privacy and security, Apple MUST offer full encryption of iCloud data as an option for users who understand what that entails and who wish to retain full ownership of their data stored in iCloud backups.

There is no such thing as a secure “master key” or “backdoor.”

Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety. – Benjamin Franklin, Historical Review of Pennsylvania, 1759


Why don’t these genius politicians next attempt to legislate in purple unicorns? They’re equally as plentiful as secure backdoors. – MacDailyNews, October 3, 2018


This is not about this phone. This is about the future. And so I do see it as a precedent that should not be done in this country or in any country. This is about civil liberties and is about people’s abilities to protect themselves. If we take encryption away… the only people that would be affected are the good people, not the bad people. Apple doesn’t own encryption. Encryption is readily available in every country in the world, as a matter of fact, the U.S. government sponsors and funds encryption in many cases. And so, if we limit it in some way, the people that we’ll hurt are the good people, not the bad people; they will find it anyway. — Apple CEO Tim Cook, February 2016

8 Comments

      1. Sorry, but even a court order is insufficient to unlock an iPhone that the owner will not unlock. Who is going to force the owner to reveal their passcode? 256-bit AES encryption is unbreakable except by brute force, which can take more time than the universe will last, even with the fastest conceivable supercomputers. In fact, it’s been calculated that the energy required to try every possible encryption key for a single device would exceed all the energy available to do it.

        Do you suggest torture or threats to a family member to force the user to unlock access? Even if you merely attack the passcode itself, a simple seven-character passcode drawn from a set of 90 letters, digits, and symbols would require over fourteen thousand years for an average solve attempt (see the rough arithmetic sketched after the comments).

        The Chinese government got around Apple’s total device encryption by forcing all users to have an unencrypted cloud backup of all data on their devices and requiring automatic daily backups, to which the government could have easy search warrant access.

        That was the reason China changed its laws in 2016 to require that all cloud storage for Chinese users reside inside Chinese borders and that the encryption keys for Chinese Apple iCloud accounts be kept inside China. That way, although Apple retained the iCloud encryption keys, they had to be available to Chinese authorities.

      2. Swordmaker, my point was just that the iPhone backdoor issue is a completely different question than illegal government spying on its citizens. They are both huge problems, but not the same problem. Spying is a political question, while backdoors are a scientific problem. Confusing the two leads to politicians demanding scientific impossibilities like “secure back doors.” It has also led to some rather misguided discussions here.

        1. Misguided? Here, in the vaunted MDN cloisters – where penetrating intellects seek to explain, with infinite patience and perfect clarity, everything there is to know to the witless? Surely you exaggerate.
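For anyone who wants to sanity-check the brute-force arithmetic in comment 1, here is a rough sketch in Python. The 90-character alphabet and seven-character length come from the comment; the 80 ms per attempt is an assumed on-device key-derivation delay, so the exact year figures are illustrative, not authoritative, and scale directly with whatever attempt rate you plug in.

# Back-of-the-envelope arithmetic for the passcode brute-force claim above.
# The 80 ms per-attempt figure is an assumption (hardware-enforced key
# derivation delay); the conclusion scales linearly with the rate you pick.

ALPHABET_SIZE = 90          # letters, digits, and symbols, per the comment
PASSCODE_LENGTH = 7
SECONDS_PER_ATTEMPT = 0.08  # assumed on-device key-derivation delay
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = ALPHABET_SIZE ** PASSCODE_LENGTH               # ~4.8e13 combinations
worst_case_years = keyspace * SECONDS_PER_ATTEMPT / SECONDS_PER_YEAR
average_years = worst_case_years / 2                      # expect success about halfway through

print(f"Passcode keyspace:      {keyspace:.2e}")
print(f"Worst-case brute force: {worst_case_years:,.0f} years")
print(f"Average brute force:    {average_years:,.0f} years")

# For comparison, the raw 256-bit AES keyspace dwarfs any passcode search:
aes_keys = 2 ** 256
print(f"256-bit AES keyspace:   {aes_keys:.2e} keys")

Whatever per-attempt delay you assume, the point stands: an on-device search of a seven-character passcode is measured in millennia, and the 256-bit AES keyspace behind it is astronomically larger still.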
