Telegram iOS app removed from App Store last week due to child pornography

“We now have the answer for why the popular messaging app Telegram was pulled from the App Store last week,” Zac Hall reports for 9to5Mac. “Telegram for iOS notably disappeared from the App Store for several hours without an explanation before the service’s CEO blamed the problem on Apple pulling the app due to ‘inappropriate content’ appearing in the app.”

“According to an email shared by a reader that includes a response from Phil Schiller who manages the App Store, Telegram was abruptly pulled when Apple learned that the app was serving child pornography to users,” Hall reports. “Telegram offers a secure messaging feature that relies on end-to-end encryption for protecting the privacy of messages sent between users. This means the illegal content was likely not simply media being shared between users but more likely content being served up from a third-party plug-in used by Telegram.”

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children). – Apple SVP Phil Schiller

Hall reports, “Within hours of Telegram being pulled, the secure messaging app returned to the App Store with fixes in place to prevent the illegal content from being served to users.”

Read more, and see the full email from Schiller, here.

MacDailyNews Take: Zero tolerance. Apple did an excellent job by taking swift action as soon as the issue was identified.


  1. If more companies took this sort of proactive approach to the criminal misuse of secure communications technology, there would be much less pressure from governments to provide ways around the security.

        1. It’s not privacy that’s concerning me in this instance. It’s the rule of law. We are a representative democracy with specific procedures for enacting and enforcing the law, and it’s NOT by corporations. Nobody elects corporations, and it’s a very slippery slope.

          Now, it can be that in this instance Apple was acting through cooperation with law enforcement. I don’t know.

          But should Apple kill an iPhone signal on a suspected drug deal? I don’t think so, that’s overstepping, unless they are working with law enforcement.

  2. I’m not a developer (although I did stay at a Holiday Inn Express last night)… question….

    If the messages are encrypted, how did they know the content of the material being sent?

    Regardless, kudos to Apple for taking such swift action. Although one must wonder why Telegram had this vulnerability in the first place.

    1. Agreed, I also don’t understand what the issue could have been.

      Encrypted messaging between individuals (or even from a bot/server attached to the system) should not have been discoverable by Apple or anyone else, and the App Store does not remove apps due to misuse, right? E.g., their own Messages app could certainly be used to send illegal content. Doesn’t seem like a vulnerability/violation in the app itself warranting removal from the store.

      More to come?

  3. Apple makes some missteps and the media is fond of jumping on those and overhyping them with FUD. How about reporting on this action by Apple – swift and effective!
