iPhone developer: Security hole could affect all iPhones, not just jailbroken ones

“The popular Apple iPhone smartphone may be at risk from a security vulnerability that affects even those models that haven’t been hacked, or ‘jailbroken,’ according to new findings from a Swiss software engineer,” Andy Patrizio reports for eSecurityPlanet.

“Nicolas Seriot, an iPhone developer, presented his findings during a conference in Geneva on iPhone privacy. According to his research, malware could exploit a previously unknown hole to access a user’s e-mail accounts, Safari and YouTube searches, keyboard cache content, and the Wi-Fi connection logs,” Patrizio reports.

“Most hacks that affect the iPhone are the ones that are unlocked with ‘jailbreak’ utilities… Evidently, however, even iPhones fresh off the shelf could be vulnerable, according to Seriot, who showed how a malicious application could gather personal data from an iPhone without using private APIs,” Patrizio reports.

“Based on his conclusions, a malicious app is free to move around all it wants once inside the system — reading a user’s address book, stealing their phone number, viewing their browser history, and culling other private data from the device,” Patrizio reports. “Apple did not respond to requests for comment.”

Patrizio reports, “Seriot also said that unlike the transmission methods popular among PC malware, iPhone trojans will make their way to the device by way of the Apple App Store. ‘Reviewers can be fooled,’ he noted in his presentation.”

Full article here.
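For readers wondering how an app could do this without touching private APIs: the data Seriot lists lived in places an early iPhone OS app could reach with ordinary, public Foundation and AddressBook calls. The sketch below (modern Swift, using today’s Contacts framework in place of the old AddressBook one) is only an illustration of that idea, not Seriot’s code; the plist path and preference key are era-specific examples that may not be exact, and current iOS versions block this kind of access with sandboxing and permission prompts.

```swift
import Foundation
import Contacts  // public framework; the 2009-era equivalent was AddressBook

// Illustrative sketch only. The paths and keys below are assumptions about
// iPhone OS 3-era artifacts; modern iOS sandboxing and permission prompts
// block these reads.

// 1. A world-readable preference file (hypothetical path for illustration):
//    recent Safari searches and similar data were plain plist entries.
let safariPrefs = "/var/mobile/Library/Preferences/com.apple.mobilesafari.plist"
if let prefs = NSDictionary(contentsOfFile: safariPrefs) {
    print("Safari preferences readable:", prefs.allKeys)
}

// 2. The device's own phone number, reportedly cached in standard user
//    defaults on early builds under a key like "SBFormattedPhoneNumber".
if let number = UserDefaults.standard.string(forKey: "SBFormattedPhoneNumber") {
    print("Phone number:", number)
}

// 3. The address book, via an entirely public framework; in 2009 no
//    permission prompt stood between an app and this data.
let store = CNContactStore()
let keys = [CNContactGivenNameKey, CNContactPhoneNumbersKey] as [CNKeyDescriptor]
let request = CNContactFetchRequest(keysToFetch: keys)
try? store.enumerateContacts(with: request) { contact, _ in
    print(contact.givenName, contact.phoneNumbers.map { $0.value.stringValue })
}
```

None of these calls is a private API, which is exactly Seriot’s point: review can only judge what code is allowed to do, and in 2009 it was allowed to do quite a lot.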

MacDailyNews Take: Apple’s response, if they find the threat to users to be credible, won’t likely be a statement to media, but rather the release of iPhone OS 3.1.3 or higher along with a credit for Seriot in the CVE list.
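On the “reviewers can be fooled” point: static review can see which frameworks and symbols an app links against, but it cannot easily see behavior that is decided at runtime. Below is a minimal, hypothetical sketch of that concern; the URL and the flag name are invented for illustration, and the idea is simply that an app can stay dormant until a server-side switch is flipped after approval.

```swift
import Foundation

// Hypothetical illustration of the "reviewers can be fooled" concern.
// The URL and the "collect" flag are invented for this sketch.

// Placeholder standing in for the kind of public-API reads sketched above.
func harvestAccessibleData() {
    print("collection enabled by remote flag")
}

let controlURL = URL(string: "https://example.com/feature-flags.json")!

URLSession.shared.dataTask(with: controlURL) { data, _, _ in
    guard let data = data,
          let object = try? JSONSerialization.jsonObject(with: data),
          let flags = object as? [String: Bool],
          flags["collect"] == true  // switched on only after review has passed
    else { return }

    // Nothing in the shipped binary looks unusual to a static check;
    // the behavior only appears once the remote flag enables it.
    harvestAccessibleData()
}.resume()
```

Which is presumably why, as several commenters note below, Apple’s real backstop is pulling an app and identifying its developer rather than catching everything in review.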

20 Comments

  1. That’s pretty much nonsense. Obviously, any application on any computer can have a malicious payload.

    Go to any shareware site; nobody checks code there. So why claim now that iPhone apps have a security hole, when Apple actually does check the code?

    Pure baloney FUD!

  2. Yup, just another example of “if you make it insecure, it’ll be more insecure” thinking without any mention of the likelihood.

    I could be IN YOUR HOUSE RRIIIGHT NOOOOWWWW! No really. Just give me your address and a key to your house. Oh, and once I get there, I’ll let you know, so don’t look at the door for about 5 minutes once I tell you I’m there. Next thing you know I’m standing behind you KEELIN YER DEUDZ!!!

  3. Macguy: you are right, this is pure FUD.

    Even in the worst scenario (a malicious application making its way to the App Store despite the review process), Apple could at any time remove it from the App Store, delete it remotely from the user’s device, and go after the malicious developer (every developer has to sign up with the iPhone Developer Program, pay at least $99 a year, and also provide tax and banking information if submitting non-free apps; this is not an anonymous process). This is one of the good reasons why Apple should continue controlling the app submission process…

  4. Apple has a remote-disable feature they’ve never yet used on the iPhone community. Having some rogue application obtained from Apple’s App Store suddenly start stealing customers’ private data is probably when Apple would use the “disable” tool.

    The good thing about authors of malware is they are like cockroaches: they work in the dark and require anonymity.

    What retard is going to register with Apple, sign agreements with them, and then write some hidden malware for stealing user data when Apple can A) remotely and globally disable his software, and B) come sue his ass and publicly disclose who he is?

    Answer: no one.

  5. “may be at risk”
    “could exploit”
    “Evidently”
    “could be vulnerable”
    “application could gather”
    “Based on his conclusions”

    Wake me when someone KNOWS something, not all this “could be” crap.

  6. Huh?
    An App that Apple sells me… can access my phone… and my data.
    That’s a surprise?

    Guess how long that App would stay in Apple’s store?
    Guess how long before the virus-author would be prosecuted?
    (Developers have to give their name and credit-card info before they can be developers.)

  7. Apple tests all Apps during the approval process.

    The chances of malware getting through to an iPhone are slim. The chances of getting through to an iPhone anonymously are approaching impossible.

    FUD

  8. Isn’t the automated iPhone app checking software supposed to do just that: check for hidden code, among other things? If you have an app that is downloading content directly from some remote server, it cannot load a Trojan horse through the app’s little backdoor.
