Apple vs. GrayKey: Leaked emails expose the fight for your iPhone privacy

“In the fight over digital privacy, Apple is forever adding layer on layer of security to its iPhones. For most users, Apple’s approach is a great boon, keeping all their information away from thieves and hackers. But for America’s cops, it’s causing a headache, preventing them getting into iPhones where they could find valuable and timely information,” Thomas Fox-Brewster reports for Forbes. “That’s why police are increasingly turning to private contractors like GrayShift, which Forbes uncloaked earlier this year as it promised to hack its way into the latest Apple cellphones.”

“Is either side winning? From emails leaked to Forbes, and in conversations with police officials, it would appear on first glance that Apple’s latest updates to its iOS operating system truly have stymied the cops and their GrayKeys,” Fox-Brewster reports. “But, at the same time, police still have a way to hack into iPhones, even the latest models, the emails show. The messages were shared by an anonymous source who had access to a private email list subscribed to by members of the police and digital forensics communities.”

“In one email, dated July 19, a detective at the Oklahoma City Police Department wrote: ‘We have run into our first phone with 11.4.1 yesterday and it’s confirmed. Plugging the device into the GrayKey results in the phone signalling that it’s charging but the GrayKey does not recognize that a phone is plugged into it,'” Fox-Brewster reports. “After the Oklahoma City detective’s comment, forensics professionals on the email thread started discussing a way to circumvent USB Restricted Mode, as recently revealed by Russian company Elcomsoft. The bypass is remarkably simple: connecting an iPhone to an accessory device, such as a $39 Apple camera adapter, will reset the one-hour timer.”

MacDailyNews Take: Presto, another avenue for Apple to close post haste. And iOS gets even more secure!

Read more in the full article here.

MacDailyNews Take: Always use long, custom, alphanumeric passcodes. Use at least seven characters – even longer is better – and mix numbers, letters, and symbols.

To change your passcode in iOS:
Settings > Face ID & Passcode > Change Passcode > Passcode Options: Custom Alphanumeric Code
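For a rough sense of why passcode length and character variety matter, here is a back-of-the-envelope sketch in Python. The guess rate is back-solved from the reported figure of 11 hours on average to crack a six-digit passcode, and the 94-character alphabet is simply printable ASCII minus the space; both are illustrative assumptions, not published GrayKey specifications.

# Rough brute-force arithmetic for iPhone passcodes.
# Assumption: ~12.6 guesses/second, inferred from "six digits in 11 hours on average".
GUESSES_PER_SECOND = 500_000 / (11 * 3600)   # half of 10^6 guesses in 11 hours

def average_crack_time_years(alphabet_size: int, length: int) -> float:
    """Average brute-force time (half the keyspace) in years."""
    keyspace = alphabet_size ** length
    seconds = (keyspace / 2) / GUESSES_PER_SECOND
    return seconds / (365 * 24 * 3600)

print(average_crack_time_years(10, 6))   # six-digit PIN: ~0.0013 years, i.e. about 11 hours
print(average_crack_time_years(94, 7))   # 7-char mixed passcode: on the order of 80,000 years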

SEE ALSO:
Apple punches back at Grayshift, maker of the GrayKey iPhone-hacking box – June 15, 2018
Cops are confident Grayshift iPhone hackers already have a workaround to Apple’s new iOS security feature – June 15, 2018
Apple to undercut GrayShift, Cellebrite tools for cracking iPhones – June 13, 2018
Apple’s iOS 11.4 update with ‘USB Restricted Mode’ may defeat GrayKey, Cellebrite forensic tools – May 8, 2018
GrayKey box can guess a six-digit iPhone password in 11 hours on average – April 16, 2018
Police around the U.S. can now unlock iPhones – April 12, 2018
Law enforcement uses ‘GrayKey’ box to unlock iPhones – March 16, 2018
The man who wrote those password rules has a new tip: N3v$r M1^d! – August 8, 2017

15 Comments

  1. This should be very simple:
    Law Enforcement gets a warrant.
    Apple unlocks the phone.

    What Law Enforcement wants:
    The ability to make you self-incriminate without a warrant or probable cause.

    Even the very conservative Roberts SCOTUS told the Donut Patrol to get a warrant.

    1. I believe in the sanctity of personal privacy, even to the extent that it decreases security in general. Personal privacy comes at a price, but the cost is ultimately higher if we are willing to sacrifice privacy for (apparent) security.

      It is getting to the point where digital devices like iPhones are becoming extensions of our brains. The Feds can’t pry into my brain (and the Fifth Amendment applies in that case, anyway). I would prefer that the iPhone and similar devices be treated similarly – much of the data on that device is personal and private and I believe that I should have full control over its dissemination. Would this impact law enforcement? Certainly. But law enforcement can acquire a lot of data (location/time, call logs, etc.) without prying into your phone. The fact that the law enforcement job would become more difficult is not a good rationale for a backdoor into my iPhone.

        1. Reminder: The “essential liberty” that Franklin was discussing was the ability of the people’s elected representatives to raise taxes over the objection of the propertied class.

    2. DavGreg, what you say is true. They should get a warrant. However, that is completely irrelevant to this discussion because most of the agencies using GrayKey and the like HAVE warrants or some other properly legal basis to conduct a search. How else do you think the device came into their physical possession? If the search were illegal, they couldn’t use the fruits to get a conviction, so there wouldn’t be any point in wasting the time and money. Suggesting otherwise is nonsense.

      The discussion is actually about whether there should be some means for officers who HAVE a lawful warrant to execute it in accordance with the judge’s order. Currently, cracking a properly-encrypted iPhone takes so much trouble that it just isn’t cost effective to try except in the most important cases. That means that an increasing number of criminals are walking free to find new victims.

      The police want Apple to make access easier by doing exactly what you suggest: unlock the phone when presented with a warrant. To do that, Apple would have to build a back door for itself. The police are not tech experts, so they don’t understand that what they are requesting (easy access with a warrant, but otherwise complete security) is simply impossible. Encryption either works or it doesn’t. A back door that Apple can use will inevitably leak so that anybody could use it. That would empower more crooks than it would stop.

      1. Law enforcement is going on fishing expeditions and seizing cell phones, holding them to hack into and trawling for stuff. This is exactly why Justice Roberts said get a warrant.

        Law Enforcement is using illegal Stingrays and is seizing phones just because they can, and this is a direct assault on your rights to privacy, the presumption of innocence, and the right against self-incrimination.

        Probable cause first, search warrant second, then take a look. That is not what is happening in many cases.

      2. DavGreg,

        Overwhelmingly, they DO get a warrant. They don’t need Justice Roberts to tell them so. James Madison and the Bill of Rights said it a long time ago.

        Please list “many cases” of peace officers who are stupid enough to “go on fishing expeditions and seize cell phones” illegally, knowing full well that none—zero, zilch, nada—of the evidence obtained (either directly or as indirect fruits of the search) will be usable in court. Indeed, an illegal search might make it impossible to later obtain a lawful warrant.

        There may be a few peace officers out there who get some sort of sexual thrill from violating citizens’ rights, but most of them are more interested in getting bad guys off the street.

        1. “Overwhelmingly?” You should be ashamed of your strong promotion of constitutional rights to privacy while being satisfied with “overwhelmingly.”

        2. What makes you think I’m satisfied? One bad cop is too many. That said, pointing out that soccer moms overwhelmingly don’t rob banks does not exonerate those who do. It does suggest that condemning all soccer moms as bank robbers is as misplaced as condemning all peace officers as data thieves.

    3. better:

      The owner of the phone holds the only encryption key (perhaps biometric). Apple cannot decrypt anything, ever, on the device or in the cloud. AWS, Google, ISPs, and others cannot get their grubby hands on any user data either. Man-in-the-middle attacks and device theft are also thwarted. For law enforcement to access a device or backup server, a warrant is required, and Apple should be required to assist in unlocking it, in the presence of the user if necessary. Noncooperation by the user or the device maker results in fines and jail time for contempt of court.

      This guarantees user privacy, which Apple currently does not do since they hold all encryption keys today on iOS devices.

      The only way to be secure and private is to have the encryption key in your own possession, and nowhere else. No matter what Apple claims.
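A minimal sketch of the user-held-key model the commenter above describes, in Python, assuming the third-party cryptography package. The key is derived from a passphrase only the device owner knows; whoever stores the ciphertext (Apple, AWS, Google, or anyone else) holds nothing that can decrypt it. The passphrase and data below are placeholders.

import os
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    """Stretch the owner's passphrase into a 256-bit key (PBKDF2-HMAC-SHA256)."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

passphrase = b"correct horse battery staple"    # known only to the device owner
salt = os.urandom(16)                           # not secret; stored with the ciphertext

f = Fernet(derive_key(passphrase, salt))
ciphertext = f.encrypt(b"private backup data")  # this is all the cloud provider ever sees
assert f.decrypt(ciphertext) == b"private backup data"

Under this model, losing the passphrase means the data is gone for good, and a warrant served on the provider yields only ciphertext; any compelled unlocking has to go through the owner or the device itself, as the commenter proposes.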

      1. Apple doesn’t hold ANY encryption keys on iOS devices. iCloud is another thing entirely.

        I notice that you are the third poster in this thread who wants privacy but has no problem with allowing the government to compel access to the passcodes.

  2. Hackers will always be able to find an exploit. The counter is that once an exploit is exposed (hackers can’t help giving the secret away), Apple closes it.

    All exploits are of short duration (unless the target is a MSFT product).

    1. Searching the device (in the US) involves the Fourth Amendment, which has exceptions for certain reasonable searches, as with a proper warrant. Forcing someone to divulge their passcode involves the Fifth Amendment, which has no such exceptions. You can’t pass a law to overrule the Constitution.
