Apple contractors ‘regularly hear confidential details’ on Siri recordings including medical details and people having sex, says whistleblower

Alex Hern for The Guardian:

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”. But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information… The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.

MacDailyNews Take: The article states that most of these accidental Siri activations occur on Apple Watch. With watchOS 5 and Apple Watch Series 3 or later, you don’t need to say “Hey Siri” to get Siri’s attention. You simply raise your wrist to wake your Apple Watch and say what you need.

Here’s how to turn the Raise To Speak feature off or on:

  1. Open the Settings app on your Apple Watch.
  2. Tap General > Siri.
  3. Turn Raise To Speak off or on.


  1. I haven’t used Siri since the year it debuted (I played with it a bit out of novelty at first), not out of privacy concerns, but because it simply isn’t terribly useful. Outside of tech or geek circles, I don’t think many people over the age of 30 or 35 use it. I have quite literally never seen Siri used outside a home, not on the street, not on the subway. I think voice assistants are being boosted by the same hype train as contactless payments, AR, and ‘AI’. They have none of my conversations because I simply don’t use it. My productivity is better without having to interact with technology that way; it actually ends up being an encumbrance. There has been so little real innovation in the valley for the past decade. 😐😐

  2. All these years later and I only use Siri for putting appointments into my calendar. I can think of a lot of other things Siri should be helpful for, but it keeps teaching me not to bother asking.

  3. Not directly related to this, but it probably wouldn’t be a terrible idea if Apple and other manufacturers supported some kind of tone that venues could play to keep digital assistants from automatically listening.

    I think Amazon has something like this for Alexa that they use to prevent activation during commercials.
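A suppression tone like the one described above could be detected by checking for strong energy at a known marker frequency in each audio frame. The sketch below uses the Goertzel algorithm (a single-bin DFT) to do that; the 18.5 kHz marker frequency and the power-ratio threshold are illustrative assumptions, not Amazon’s actual scheme.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power at one frequency bin, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 * s_prev2 + s_prev * s_prev - coeff * s_prev * s_prev2

def contains_suppression_tone(samples, sample_rate,
                              tone_hz=18500.0, ratio=100.0):
    # Hypothetical rule: if power at the marker frequency dwarfs power
    # at a nearby reference frequency, treat the frame as carrying the
    # "do not wake" tone and skip wake-word detection for it.
    marker = goertzel_power(samples, sample_rate, tone_hz)
    reference = goertzel_power(samples, sample_rate, tone_hz - 2000.0)
    return marker > ratio * max(reference, 1e-12)

# Demo: a frame carrying the marker tone vs. ordinary speech-band audio.
rate = 48000
frame = [math.sin(2 * math.pi * 18500.0 * t / rate) for t in range(1024)]
speech = [math.sin(2 * math.pi * 300.0 * t / rate) for t in range(1024)]
print(contains_suppression_tone(frame, rate))   # marker tone present
print(contains_suppression_tone(speech, rate))  # no marker tone
```

A real deployment would be more involved (windowing, multiple frames, robustness to playback hardware that attenuates near-ultrasonic frequencies), but the core idea is just this single-frequency check run before the wake-word detector.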
