Apple: We intentionally left iOS kernel unencrypted

“When Apple released a preview version of iOS 10 at its annual developers conference last week, the company slipped in a surprise for security researchers — it left the core of its operating system, the kernel, unencrypted,” Kate Conger reports for TechCrunch. “‘The kernel cache doesn’t contain any user info, and by unencrypting it we’re able to optimize the operating system’s performance without compromising security,’ an Apple spokesperson told TechCrunch.”

“Apple has kept the inner workings of the kernel obfuscated by encryption in previous versions of iOS, leaving developers and researchers in the dark,” Conger reports. “Although encryption is often thought to be synonymous with security, the lack of encryption in this case doesn’t mean that devices running iOS 10 are less secure. It just means that researchers and developers can poke around in the kernel’s code for the first time, and any security flaws will come to light more quickly. If flaws are revealed, they can be quickly patched.”

Conger reports, “Opening up the kernel’s code for inspection could weaken the market for security flaws like the one the FBI is presumed to have used to get into the San Bernardino iPhone.”

Read more in the full article here.

MacDailyNews Take: As we wrote yesterday:

Apple to the government overreachers: Checkmate!

Apple exposes iOS kernel in what could be savvy strategy or major screwup – June 22, 2016


  1. I love that he said the FBI is ‘presumed to have used’ an exploit because, sorry, looking at the almost endless evidence of FBI incompetence and mendacity, I automatically assume they lied whenever they claim to have gotten in anywhere, anytime.

  2. MDN, please get rid of those “promoted content” links, the ones by MGID. I know you make money off of advertisements, but this is bottom-of-the-barrel crap: fake articles pretending to be something they’re not, when they’re just tricks to get you to click something you wouldn’t normally click. I think these “articles” cheapen websites that use them. Even ordinary advertisements are better than these.

  3. How do these two statements jibe?

    … By unencrypting it we’re able to optimize the operating system’s performance without compromising security.

    … Any security flaws will come to light more quickly.

    If the goal actually is to put kernel security to the test, then of course the point is to compromise security. Exploding head averted. 😎

    1. Disaster cover-up on the part of Apple. There is no technical reason for Apple to open its iOS kernel when Apple doesn’t even bother to have a reward for identification of exploits. Expect many security updates in the future as Apple attackers take aim.

      1. Your “no technical reason” theory is only correct if there are not a large number of security researchers who find bugs for reasons other than getting a bug bounty.
        But, there are many researchers who do this for reasons other than getting a bug bounty.
        So much for your theory. Maybe it was a mistake, maybe it wasn’t, but there ARE reasons why Apple might have done this deliberately.

        1. Krioni, I suggest you read the book “Future Crimes”, which provides a revealing discussion of the pitfalls and huge growth in digital crime activity — much of it state-sponsored. Apple is a huge target and no matter how much they talk about privacy, the reality is that the hackers are incessantly poking around for exploits to infiltrate and steal data. Apple is better than some, but far from perfect.

        2. I completely agree with you, MacUser: Apple is not perfect, the threats from criminals and state actors are real, and Apple is a huge target.
          My point was that Mike’s argument was seriously flawed and didn’t support his conclusion. He said “_no_ technical reason for Apple to open its iOS kernel when Apple doesn’t even bother to have a reward for identification of exploits.” [emphasis added] That is incorrect. Multiple security experts pointed out _technical reasons_ that it was arguably a good idea for Apple to do what it did, before Apple announced that the non-encryption was intentional.
          So, perhaps Apple did it intentionally, perhaps they didn’t. We can’t prove or disprove that based on the fact that they said it was. But it is not logical to say that it is _impossible_ for it to have been intentional, which is what Mike argued.

          Personally, I think it was Apple trying to kill the market for secret exploits by making it easier for many researchers to uncover vulnerabilities, instead of some quietly selling them to the state actors you rightfully warned about. It will mean more vulnerabilities are found and reported (and thus fixed!) overall, but it will decrease the number that are secretly found and NOT fixed. That’s what Apple really should be worrying about, and it seems like they are. Perhaps they have read the book “Future Crimes” themselves. One can hope.

  4. It’s funny how some people hate to be wrong… Those who said it was a mistake on Apple’s part are now saying this statement by Apple is just a cover-up.

    So yes, you, a person with ZERO inside knowledge of the issue, are correct. And Apple, with thousands of developers who’ve performed hundreds if not thousands of builds of the OS, made a mistake ONLY on this last final developer preview build before sending it out.

    Get over yourself.

    How about some facts?

    1. The FBI said it used an exploit to break into the phone and is not telling Apple about it.

    2. Apple on its own was obviously not able to find that exploit.

    3. That exploit and others were found even though the kernel has always been encrypted – and other exploits can and will be found regardless.

    4. If the code were to remain encrypted, it would be very difficult to track down and find exploits, meaning that mostly people looking to make money from an exploit would take the time to dig through the encrypted code.

    5. Having it opened up means many others can now look at the code with very little effort, and a lot of those people will be developers and researchers who advocate security and will help plug the holes.

    1. If what you say is true, then this doesn’t speak very highly of the encryption Apple relied on to protect the kernel.

      … and so the best course of action is to just abandon all encryption altogether in the hope that only white-hat hackers will find the exploits? No. Bad actors have more incentive to keep their exploits secret than Apple offers them to solve Apple’s problems for the benefit of Apple’s thousands of developers, whom you apparently think are perfect. And we all know that Apple’s software quality has declined with increased OS complexity and an internal reliance on automated quality checking in order to meet arbitrary ship dates.

      1. Wrong. It seems like you don’t understand what encryption means in this context. The kernel has to be unencrypted at some point in order to execute. THAT is how vulnerabilities were being discovered secretly and sold to, for example, the FBI.
        Straw man argument: “abandon all encryption altogether.”
        No one is saying Apple is doing that except you. Why would you say that? It is nonsense, and I think you know it.
        Bad actors will still keep their exploits secret. What Apple’s action does here is make it easier for non-bad actors to find and REPORT vulnerabilities that might otherwise only be known secretly.

        This is about Apple tilting the balance in favor of white hats. As vulnerabilities are fixed in the kernel, it doesn’t matter whether it is encrypted or not. Encryption is about obfuscation here, and Apple has reasonably decided openness will work better to reduce or eliminate the number of vulnerabilities secretly known and sold.

        1. That is much more clear, thank you.

          Still, for a proprietary OS, it seems odd to have root OS code opened up. This is one of the many reasons Apple users claimed their OS was superior; now Apple seems to be relying on open-source-style error and bug finding, just like Google and open-source software projects. But Apple still offers no significant reward for exploit findings.
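An aside on the obfuscation point argued in the thread above: encrypting the on-disk kernel image hides its symbols and strings from casual inspection, but the image must be decrypted to run, so encryption there is obfuscation rather than runtime protection. A minimal, purely illustrative sketch (no real kernelcache involved; the symbol names are made up) of why a readable image matters to researchers, using a toy version of the Unix `strings` scan:

```python
# Illustrative demo only: the on-disk encryption debated above is obfuscation,
# not runtime protection. A readable image surfaces symbols; an opaque one
# (standing in for an encrypted blob) yields essentially nothing.
import os
import re

def printable_strings(blob: bytes, min_len: int = 6) -> list[str]:
    """Return runs of printable ASCII, like the Unix `strings` tool."""
    return [m.group().decode() for m in re.finditer(rb"[ -~]{%d,}" % min_len, blob)]

# Hypothetical symbol names embedded in a fake "unencrypted" image:
plain_image = b"\x00com.apple.driver.ExampleSymbol\x00_panic\x00"
opaque_image = os.urandom(64)  # stands in for an encrypted blob

print(printable_strings(plain_image))   # symbols a researcher can grep for
print(printable_strings(opaque_image))  # almost certainly nothing useful
```

This is the practical difference the commenters are circling: with the iOS 10 kernelcache shipped unencrypted, this kind of trivial inspection became available to everyone, not just those who had recovered decryption keys.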

  5. By opening up the core, Apple seems to be calling the FBI’s hand: “OK FBI, you said that you found an exploit. Now show us where. If not, fold.”

    But I don’t understand how this momentary win would help Apple except on the level of ego. Although, if the FBI can’t produce that exploit, Apple can perhaps recover its reputation against the FBI’s allegation that Apple produces an iPhone that is less secure than its advertisements say.
