U.S. Senator Mike Lee of Utah and Colorado Congressman Ken Buck sent a joint letter to the CEOs of Apple, Google, and Amazon to question the companies’ actions in removing the social network Parler from their app stores and web hosting services earlier this year.
According to the letter by Sen. Lee and Rep. Buck, the actions by the three companies occurred over a span of just three days, cutting the service off from its 15 million users.
“The timing of steps taken against the Parler social network by your companies, and the fact that the actions seem to lack any of the procedural fairness typically afforded in the case of an alleged breach of contract, create the appearance of close coordination,” the letter states. “These actions were against a company that is not alleged to have violated any law. In fact, information provided by Parler to the House Oversight Committee revealed that Parler was assisting law enforcement even in advance of January 6th.”
MacDailyNews Note: The joint letter to Apple CEO Tim Cook, Google CEO Sundar Pichai, and Amazon CEO Jeff Bezos from Senator Lee and Congressman Buck, verbatim:
Dear Mr. Pichai, Mr. Cook, and Mr. Bezos:
January 8, 2021 marked the start of a series of actions against one small business by three of the largest technology companies in the world.
As detailed in the timeline below, the timing of steps taken against the Parler social network by your companies, and the fact that the actions seem to lack any of the procedural fairness typically afforded in the case of an alleged breach of contract, create the appearance of close coordination.
According to public sources:
• On January 8, 2021, Apple sent Parler notice of expulsion from the App Store. Parler was provided only 24 hours to remediate. Google sent Parler notice of expulsion from the Play Store, reportedly within hours of Apple’s notice. Parler was not provided a remediation option. Later in the day, Google removed Parler from the Play Store.
• On January 9, 2021, Apple removed Parler from the App Store. Amazon sent Parler a notice of suspension of its cloud services. No remediation option was provided.
• On January 10, 2021, Amazon suspended service to Parler.
In just three days, Apple and Google effectively cut off Parler’s primary distribution channel, and Amazon cut off Parler’s access to critical computing services, leaving the company completely unable to serve its 15 million users. These actions were against a company that is not alleged to have violated any law. In fact, information provided by Parler to the House Oversight Committee revealed that Parler was assisting law enforcement even in advance of January 6th.
Please provide written answers and any related documentation, including e-mail and text messages, to the following inquiries no later than April 15, 2021:
- Please provide the specific provisions of your policies that resulted, where applicable, in suspension or expulsion from distribution channels (Apple App Store and Google Play Store) or termination of cloud service.
- Provide a complete history of all changes to policies that govern requirements for content moderation, including changes to definitions of what is acceptable and prohibited speech or conduct.
- How often are businesses reviewed for compliance with your terms? Is this an ongoing process?
- How many businesses were reviewed in 2020? Of the businesses reviewed in 2020, how many were reviewed because of content moderation practices?
- How many were suspended?
- How many were terminated?
- What triggers the review process? Are outside inputs such as news reports used in the review process?
- Who is involved during the review process? Is the process independent, or are all individuals participating in the review employees of the company?
- Who makes the final decision to initiate a review process?
- Detail the process used during a review to evaluate the target of a review, including any procedures to ensure fair treatment.
- Are there different processes for companies that have been accused of violating laws (e.g., money laundering, child trafficking) vs. matters related to content moderation?
- Are automated review processes used, and if so, who sets the criteria used by the algorithm to determine whether a business should be reviewed? If automated review processes are used, please describe how the algorithm functions and the data sources used by the algorithm.
Notice, Cure, Termination, and Appeal Process:
- Are businesses notified before a review process is initiated?
- If a review process identifies potential non-compliance, are businesses notified, or is the decision to terminate contracts and/or agreements automatic?
- Does any notification of non-compliance also include a period of time for remediation to avoid suspension or termination?
- Is there an appeal process for businesses notified of potential non-compliance? If yes, please describe.
- Is there an appeal process for businesses notified of suspension? If yes, please describe.
- Is there an appeal process for businesses notified of termination? If yes, please describe.
- Who makes the final decision on appeals? Is the appeal review independent? Are outside experts consulted, or are all of the individuals involved employees of the company?
- List all businesses terminated/removed since 2017 as a result of content moderation policy violations, along with the dates of their first notice and final termination/removal. How many of these businesses were in social media?
- Was Parler given notice of the potential violation? Was the same amount of time offered to Parler to cure any potential policy violations as is given to other potential violators?
- Who determined the amount of time, if any, provided for Parler to take remediation measures?
- What was the basis for suspension or removal given to Parler in the initial notice?
- Who at your respective companies made the final decision to suspend or terminate Parler’s contracts and agreements?
- Was the final decision made outside the standard process by company leadership? If so, who made the decision?
- Did the final decision include input from the company’s political or media relations advisors, in-house or external? If so, who?
- Was anyone outside the company consulted about the decision before it was made?
- Was anyone outside the company informed of the decision before it was made public?
- Was anyone in the media provided with embargoed notice of the termination?
Coordination of Action:
- Apple, Google, and Amazon each took actions related to Parler within hours and days of each other. Were these actions taken independently of each other?
- Were there any contacts between any of your companies prior to the action against Parler? If so, with whom?
- Was notice of pending action against Parler shared among your companies?
- Was there any effort to sequence public announcements about your action against Parler?
Senate Judiciary Subcommittee on Competition Policy, Antitrust & Consumer Rights
House Judiciary Subcommittee on Antitrust, Commercial & Administrative Law
MacDailyNews Take: Again, Apple’s stated reasoning for pulling Parler is fine, if applied uniformly.
But Twitter, Facebook, etc. have never been suspended from Apple’s App Store. Why?
Why are the same sorts of comments okay to exist on some services, but considered a ban-able offense – to the tune of removing the entire app – on others? What’s the difference? What does it reveal about those at Apple who are ultimately making the decisions when it’s ban-able “hate speech” on one platform, but left to fester for 6.5 years (and counting) on others?
Anyone who claims that Twitter, Facebook, etc. have robust content moderation in place that effectively removes harmful or dangerous content that encourages violence and illegal activity is either ignorant or lying.