“According to an email shared by a reader that includes a response from Phil Schiller, who manages the App Store, Telegram was abruptly pulled when Apple learned that the app was serving child pornography to users,” Hall reports. “Telegram offers a secure messaging feature that relies on end-to-end encryption for protecting the privacy of messages sent between users. This means the illegal content was likely not simply media being shared between users but more likely content being served up from a third-party plug-in used by Telegram.”
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).” – Apple SVP Phil Schiller
Hall reports, “Within hours of Telegram being pulled, the secure messaging app returned to the App Store with fixes in place to prevent the illegal content from being served to users.”
Read more, and see the full email from Schiller, here.
MacDailyNews Take: Zero tolerance. Apple did an excellent job by taking swift action as soon as the issue was identified.