A Washington nonprofit group, the Coalition for a Safer Web, sued Apple in federal court Sunday, demanding that it remove Telegram, a chat and social media app, from its app store for failing to crack down on “violent, extremist conversation.”
[The Coalition for a Safer Web] complained about Telegram’s role in hosting white supremacist, neo-Nazi and other hateful content, and argued in the lawsuit that such content puts Telegram in violation of Apple’s terms of service for its app store. A similar suit is planned against Google, said the coalition’s lawyer, Keith Altman.
“Telegram stands out by itself as the superspreader [of hateful speech], even compared to Parler,” [the coalition’s president, Marc] Ginsberg said in an interview.
Ginsberg, who is Jewish, asserts in the suit that Telegram’s anti-Semitic content puts him in peril and that his ownership of an iPhone gives him standing to sue Apple in federal court to require that the company enforce its terms of service barring hate speech and incitement to violence on apps carried by the App Store.
The suit, filed in the U.S. District Court for the Northern District of California, alleges negligent infliction of emotional distress and violation of the California business code, and seeks unspecified compensatory damages and an injunction requiring Apple to remove Telegram from its app store.
Telegram, which says it operates from Dubai, was developed by Russian Internet entrepreneur Pavel Durov. The app is popular with people who want to keep their communications shielded from autocratic regimes and others seeking online privacy. Durov himself has clashed with the Russian government over censorship and encryption.
But Telegram also has a reputation for being the go-to app of terrorism and hate groups. For years, it was used by Islamic State militants to communicate and spread propaganda, until European police worked with Telegram to take down accounts associated with the group in 2019.
MacDailyNews Take (by SteveJack):
Welcome to the slippery slope, Apple.
If this keeps up, Apple’s own Messages app, end-to-end encrypted and baked into Apple’s operating systems (therefore not subject to App Store rules), could be subjected to similar lawsuits.
You’ll recall that Apple’s App Review Board wrote to the developers of Parler, in part:
We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.
Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines… Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content…
While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues.
For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.
Again, Apple’s reasoning for pulling Parler is fine, if applied uniformly.
Besides the fact that Twitter, Facebook, etc. remain on the App Store, I see another issue: by Apple's stated reasoning for banning Parler, encryption should be disabled in Apple's iMessage system and a robust content moderation plan put in place, or Apple's Messages app should be removed from the platform.
On February 16, 2016, Apple CEO Tim Cook published “A Message to Our Customers” on Apple.com in which he stated, in part:
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
…The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone… make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control…
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge… In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable…
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
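Cook's "master key" point is worth making concrete. The toy sketch below (a deliberately insecure repeating-XOR cipher, not real cryptography, and not how iMessage actually works) illustrates the single property his letter hinges on: ciphertext is exactly as secure as the secrecy of the key, and anyone who obtains the key can read everything.

```python
# Toy XOR "cipher" -- NOT real cryptography; it exists only to show that
# once the key is known, the encryption is defeated by anyone holding it.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; applying the same operation
    # twice with the same key round-trips the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"master-key"      # whoever holds this can read everything
message = b"meet at noon"

ciphertext = xor_crypt(message, secret_key)
assert ciphertext != message    # unreadable without the key

# Anyone who learns the key -- owner, attacker, or government -- can
# decrypt at will; there is no "only the good guys" version of this:
recovered = xor_crypt(ciphertext, secret_key)
assert recovered == message
```

A real system like iMessage uses per-device public-key cryptography rather than a shared secret, but the asymmetry Cook describes is the same: a mandated backdoor would function as a master key over every lock at once.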
So, what's more "dangerous and harmful": a social network's app, in this case Parler, that might at times publicly display "dangerous and objectionable content," or a totally opaque app like Apple's Messages, where "dangerous and objectionable content" can be relayed at will, in total secrecy, over the end-to-end encrypted iMessage system?
Some might say “Oh, but it’s not the same thing! One is a social network where people follow one another and a dangerous message could be relayed to a large group of people, but the other one is just a messaging system!”
Yes, it's true that Apple's Messages limits group chats to a maximum of 32 people. How could 32 people cause any damage? Well, for one example, nineteen hijackers carried out the September 11 attacks. Further, in Messages, information can easily be forwarded to others or copied and pasted into other group messages, making the capacity to send "dangerous and objectionable content" via Apple's encrypted iMessage service virtually unlimited.
Would someone trafficking in “objectionable content” or planning “dangerous and harmful illegal activity” choose to use a public social network or would they use a completely private system explicitly designed to be impenetrable by anyone outside of the group?
It doesn't matter what you call it. Any app that allows the passing of messages – Parler, Messages, or even apps with messaging attached, like Words With Friends, Fitbit, and many thousands of others – is not in compliance with Apple's App Store Review Guidelines if it doesn't have a commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity.
The fact is that Apple's Messages app lacks any measures whatsoever to address the very issues Apple accuses Parler of not adequately addressing. Apple's Messages app is perfectly "inadequate to address dangerous and harmful content" or "illegal activity." With the Messages app, Apple itself has not upheld any commitment to "moderate and remove harmful or dangerous content encouraging violence and illegal activity," and, accordingly, is not in compliance with Apple's own guidelines.
Clearly, with Messages and iMessage end-to-end encryption, there is no way at all to moderate content, much less "find direct threats of violence and calls to incite lawless action," but it is highly likely that such threats exist in violation of Apple's guidelines.

Of course, Apple could claim, "Hey, nobody knows for sure; it's a black box, so Messages is technically A-OK." But that would be a weasel's way out of the ill-constructed box Apple has concocted, since Messages is obviously a far more effective tool for secretly transmitting "dangerous and harmful content" than Parler or, for that matter, Twitter, Facebook, et al.
Perhaps Apple's own apps – baked right into its operating systems, no less – are exempt from complying with the App Store Review Guidelines with which select other companies must comply or face expulsion? If so, surely competent antitrust investigators would find this example of self-favoritism interesting, if not evidentiary.
SteveJack is a long-time Macintosh user, web designer, multimedia producer, and contributor to the MacDailyNews Opinion section.
MacDailyNews Note: Today is Martin Luther King Day in the U.S. and the markets are closed. As usual on such trading holidays, we will have limited posting today.