Forensic scientist claims suspicious ‘back doors’ running on every iOS device

“Forensic scientist and author Jonathan Zdziarski has posted the slides (PDF) from his talk at the Hackers On Planet Earth (HOPE/X) conference in New York called Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices,” Jason D. O’Grady reports for ZDNet.

“Zdziarski, better known as the hacker ‘NerveGas’ in the iPhone development community, worked as a dev-team member on many of the early iOS jailbreaks and is the author of five iOS-related O’Reilly books including ‘Hacking and Securing iOS Applications,'” O’Grady reports. “In December 2013, an NSA program dubbed DROPOUTJEEP was revealed by security researcher Jacob Appelbaum that reportedly gave the agency almost complete access to the iPhone. The leaked document, dated 2008, noted that the malware required ‘implant via close access methods’ (presumably physical access to the iPhone) but ominously noted that ‘a remote installation capability will be pursued for a future release.'”

“In his talk, Zdziarski demonstrates ‘a number of undocumented high-value forensic services running on every iOS device’ and ‘suspicious design omissions in iOS that make collection easier,'” O’Grady reports. “According to one slide, the iPhone is ‘reasonably secure’ against a typical attacker, and the iPhone 5 and iOS 7 are more secure from everybody except Apple and the government. But he notes that Apple has ‘worked hard to ensure that it can access data on end-user devices on behalf of law enforcement’ and links to Apple’s Law Enforcement Process Guidelines, which clearly spell this out.”

O’Grady reports, “Two solutions for the security conscious are to: a) set a complex passcode, and b) install the Apple Configurator application (free, Mac App Store), set enterprise Mobile Device Management (MDM) restrictions on your device, then delete all pairing records (a.k.a. pair locking).”
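The pairing-record cleanup half of that "pair locking" advice can be sketched in shell. This is a minimal sketch under an assumption: on OS X of this era, the host stores one pairing record per device as a .plist in /var/db/lockdown (clearing the real directory requires sudo). The function name and the directory argument are hypothetical, for illustration only.

```shell
#!/bin/sh
# Sketch of the "pair locking" cleanup step: delete the host-side pairing
# records so no previously trusted computer can silently talk to the device.
# Assumption: records live as per-device .plist files in /var/db/lockdown
# (the OS X convention of this era; deleting there needs sudo).

clear_pairing_records() {
    dir="${1:-/var/db/lockdown}"   # allow an alternate directory for testing
    # Each .plist represents one paired host/device relationship; remove all.
    rm -f "$dir"/*.plist
}

# Example (on a real Mac this would be run with sudo):
# clear_pairing_records /var/db/lockdown
```

After clearing records, the MDM pairing restriction set via Apple Configurator is what prevents new pairings from being created.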

Read more in the full article here.

[Thanks to MacDailyNews Reader “DavidinMpls” for the heads up.]

17 Comments

  1. I just wrote up an initial analysis of this ‘research’ for some Mac security friends. I’ll spare you the details and simply point out that:

    1) The ‘research’ was done on iOS 4.

    2) Extrapolating his EXTREMELY tenuous conclusions to iOS 5, 6, 7 and 8 is NOT science. It’s mere hit whoring, as we say around here. Such is the state of ‘Publish Or Perish’ bad science these days.

    3) No doubt his EXTREMELY tenuous conclusions will be evaluated for duplication, as will his unscientific extrapolation of those conclusions to further versions of iOS.

    IOW: NOT of concern at this time.

    What IS of concern: The security of iMessage encryption is wide open to surveillance by any service with Apple’s cooperation. That has been extensively studied and explained in the security community. But note the lack of hit whoring headlines. (o_O) 🙄

    1. 1) Not true. Zdziarski has studied and reported on every version of iOS through iOS 7.1.

      2) That is YOUR opinion. Zdziarski’s second slide here http://www.zdnet.com/forensic-scientist-identifies-suspicious-back-doors-running-on-every-ios-device-7000031795/ explains how Apple’s encryption hasn’t become adequately secure despite significant work in each release.

      3) So instead of spreading the ignorance, let’s allow the researcher to continue his work? Apple does not benefit when fanboys WITHOUT EVIDENCE OF THEIR OWN go on to trash security research that is intended to help Apple improve its platform.

      You don’t have to be concerned, Derek. However, potential CIOs who are considering deploying iOS in their organizations would like to understand _exactly_ how Apple addresses these potential security issues that you, apparently as an individual user, seem happy to ignore.

      An Apple fan blog isn’t going to offer an unbiased explanation, so the wise mobile user should ask Apple to prove that it has addressed all these problems & vulnerabilities. Apple is not perfect, so let’s not default to assuming there is never a security issue.

      1. Yawn. Sorry to bore everyone, but here we go:

        Did YOU pay for and read the Jonathan Zdziarski research paper? NO.

        Did Jason D. O’Grady pay for and read the Jonathan Zdziarski research paper? NO.

        Did any of my Mac research friends or I pay for and read the Jonathan Zdziarski research paper? NO.

        All anyone has bothered with is his slide show:
        http://www.zdziarski.com/blog/wp-content/uploads/2014/07/iOS_Backdoors_Attack_Points_Surveillance_Mechanisms.pdf

        IF anyone verifies that “undocumented services running on every iOS device” is a fact, at THAT point his research will be OF CONCERN. Meanwhile, it’s SPECULATION.

        He is quite incorrect in his assessment that “Encryption in iOS 7: Not Much Changed”. That’s a very lazy statement. He makes NO reference in his work that I am aware of to Apple’s paper regarding security on iOS 7.

        http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.pdf

        By contrast, Steve Gibson of the Security Now podcast, who is himself a security software developer, raked through Apple’s documentation over the course of three weeks. The only flaw he found was the aforementioned iMessage key-sharing problem, of which Zdziarski says nothing.

        Is there access to the ‘packet sniffing’ if an iOS user has WPA2 encryption running? NO. Is there access to the ‘packet sniffing’ if an iOS user is connected to a website via SSL/TLS? NO.

        YES, everyone wants full documentation of this ‘packet sniffing’ process and what it is for. Why isn’t it documented? Is this another Apple security blunder (of which there are many)? OR is this enablement of NSA treasonous spying on US citizens communicating within the USA? THAT would be of grave concern.

        Meanwhile: NONE of what Zdziarski provided in his slide show is of concern at this time, as I stated.

        Science always requires:
        PROOF
        and
        VERIFICATION from a third party.

        So have fun going all PaRaNOId while the rest of us wait for the details AND the verification.

        Now buzz off.

      2. I will point out that IF the undocumented ‘packet sniffer’ turns out to be a spy hole (‘back door’) deliberately built into iOS, it will be VERY bad for Apple. It’s already horrendous that We The People are wise to be entirely untrusting of our own, corrupt government.

        Meanwhile: It’s not worth the paranoia until we get the detailed proof, end the guessing, and get some verification. Is that enough blether on this subject?

        1. I’m with Paul on this. Apple needs to be much clearer about iOS security, especially if this is what Cook wants to sell to corporations. OS X has been around long enough to have been analyzed for security, and it still has regular patches. It would be a mistake to assume that iOS offers anywhere near the same level of security.

          On another tangent, Google-style ad-driven datamining is getting out of hand on iOS. Apple may be better than most, but all the iCloud, advertising, and datamining in iOS makes it very clear that Apple is not giving the end user OS X-level security; it’s providing all kinds of portals through which Apple and 3rd parties gather user info. The documentation on these tracking features is poor. Apple does not offer the end user enough info to know exactly what is going on or when, nor clear opt-out provisions. All it takes is a few coding mistakes in the user-analytics features of iOS and boom, we have a security problem. Just because Apple doesn’t advertise or document this on its consumer-level OS does not mean that Apple is that far ahead of hackers. Do Apple users need to be paranoid? Of course not. But Apple users should take security a lot more seriously than most of them do today. It would be prudent to be less dismissive of security researchers who can, and do, hack Apple products all the time.

          1. I’m dropping out of this contention, but I will thank Paul for waking me up to the fact that the study does indeed focus on iOS all the way through iOS 7, making it far more legitimate than the initial article I read indicated.

            See below for more updated information released via The Register:

  2. Near the end of the article comes the unsurprising bang. “Overall, the otherwise great security of iOS has been compromised… by Apple… by design.”

    The thing is that Apple realizes the value of peace, yeah yeah, it’s not very patriotic so they have to act quietly in the shadows so that hackers from the free world can keep track of what’s going on with organizations within the wannabe terrorist nation (insert your favorite initials here).

    Note: Postings have been compromised. Sent at 4:12 relative time.

  3. Thanks for the translation. Tried to find a link to the source. When they referenced the 2008 data, I guessed this was some old study.

    Another hit piece by the “make apple look bad foundation”.

    1. Because it’s true. MDN knows.

      iOS8 will have less of these vulnerabilities, but not zero. Apple is being less cooperative with the governments of the world, but they are still cooperating.

      It’s very much like politics in the US – extreme right wing makes outrageous demands, center moves right in order to accommodate, that becomes the ‘new normal’. Cycle repeats.

      Surveillance organizations, particularly in the US, are no longer getting the level of buy-in from Apple (and others) that they were before, but that simply means there are now low hurdles where before there were no hurdles. Yet what’s needed are not hurdles, but tall walls, protecting all of us.

      Jobs pushed back. He paid a price. Cook is only now beginning to. With an inauspicious start, where he was more than willing to cooperate more or less fully, it remains to be seen how far he will go to do the right thing now.

  4. The issue is that iOS probably DOES have backdoors. They are not easy to exploit, though, as Apple is said to have a months-long queue of requests from authorities asking to bypass devices’ passcode protection.

  5. Dropping out of the contention above, here is a more useful article about the study:

    http://www.theregister.co.uk/2014/07/21/ios_firmware_contains_packet_sniffer_and_host_of_secret_spying_tools/

    It’s apparent that The Register bothered to buy and read the full report. They also provide some further details from the author AND Apple:

    Zdziarski said he was inspired to delve deeper into iOS security after reading a report in Der Spiegel that the NSA was targeting iOS gadgets and the systems they are paired with. While Zdziarski says he doesn’t want to be sensationalist about his findings, it’s clear Apple owes customers some answers. . . .

    After publication, Apple apparently briefed journalists that the services identified by Zdziarski are not deliberately provided for government agencies to exploit. Instead, they are for “diagnostic” purposes and to allow enterprise IT bods to manage workers’ devices.

    “The problem with this is that these services dish out data (and bypass backup encryption) regardless of whether or not ‘Send Diagnostic Data to Apple’ is turned on or off, and whether or not the device is managed by an enterprise policy of any kind,” Zdziarski responded on his blog.

    Zdziarski’s blog response is here:
    http://www.zdziarski.com/blog/?p=3447

    In a response from Apple PR to journalists about my HOPE/X talk, it looks like Apple might have inadvertently admitted that, in the classic sense of the word, they do indeed have back doors in iOS, however claim that the purpose is for “diagnostics” and “enterprise”. . . .

    I don’t buy for a minute that these services are intended solely for diagnostics. The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?

    IOW: We now have a PROBLEM. There will be a lot more about this situation in the coming weeks. Apple has some explaining to do.

    1. I don’t buy for a minute that these services are intended solely for diagnostics. The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption.

      As the saying goes, “do not attribute to malice that which can be adequately explained by incompetence”.

      I read the above and my initial thought was “Oops. Apple messed up, as they have before.”

      Zdziarski’s conclusion from the same data was to go straight to “CONSPIRACEEEEE! APPLE’S IN LEAGUE WITH DA GUBMINT!!!!111”

      ——RM

      1. Your point about incompetence fits right into my theory that most evil doings are done out of stupidity, laziness or cheapness vs maliciousness.

        At this point I’m going to STFU about my POV on Zdziarski’s questions and claims. The pros (which excludes me of course) have the research in hand and will thrash through it. Apple WILL respond and close up the mess in iOS or be branded as NSA butt boys. We live in interesting times, as the old curse goes.

    2. In general it is a lot safer to use Apple products than it is to use the alternatives commonly available, firstly because of the inherently more secure OS and ecosystem, and secondly because of Apple’s reputation.
      That is not to say that using Apple is safe.
      Dangerous analogy to cars: driving a Volvo or a Saab is safer than driving a Landwind in an identical situation. You do, however, still have to do some work yourself.
