Apple apologizes for Siri grading program, makes changes to improve Siri’s privacy protections

Apple statement, August 28, 2019:

At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.

We know that customers have been concerned by recent reports of people listening to audio recordings of Siri requests as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests, and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

How Siri Protects Your Privacy

Siri has been engineered to protect user privacy from the beginning. We focus on doing as much on device as possible, minimizing the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.

Siri uses as little data as possible to deliver an accurate result. When you ask a question about a sporting event, for example, Siri uses your general location to provide suitable results. But if you ask for the nearest grocery store, more specific location data is used.

If you ask Siri to read your unread messages, Siri simply instructs your device to read aloud your unread messages. The contents of your messages aren’t transmitted to Siri’s servers, because that isn’t necessary to fulfill your request.
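
As a rough illustration of that device-side fulfillment, here is a minimal Swift sketch; the types and functions below are assumptions for illustration, not Apple’s actual Siri code:

```swift
// Hypothetical sketch: the server's reply carries only an instruction,
// so the device fulfills the request locally and no message content
// crosses the network.
enum SiriServerResponse {
    case readUnreadMessagesLocally
}

func handle(_ response: SiriServerResponse, unreadMessages: [String]) {
    switch response {
    case .readUnreadMessagesLocally:
        for message in unreadMessages {
            speak(message)   // stand-in for on-device text-to-speech
        }
    }
}

func speak(_ text: String) {
    print("Reading aloud: \(text)")
}

handle(.readUnreadMessagesLocally, unreadMessages: ["Running late!", "See you at 6."])
```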

Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.
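
A minimal sketch of what pairing data with a random identifier, and later disassociating it, might look like; Apple has not published Siri’s server-side code, so every name below is hypothetical:

```swift
import Foundation

// Hypothetical sketch; all names here are illustrative.
struct SiriRequestRecord {
    var deviceIdentifier: UUID?   // random per-device ID, never an Apple ID
    let transcript: String
    let receivedAt: Date
}

final class RequestStore {
    private var records: [SiriRequestRecord] = []

    // Store a request keyed only by the device's random identifier.
    func record(transcript: String, device: UUID, at date: Date = Date()) {
        records.append(SiriRequestRecord(deviceIdentifier: device,
                                         transcript: transcript,
                                         receivedAt: date))
    }

    // After roughly six months, strip the identifier so the data can no
    // longer be tied back to the device that produced it.
    func disassociateOldRecords(asOf now: Date = Date()) {
        let sixMonths: TimeInterval = 60 * 60 * 24 * 183
        for i in records.indices where now.timeIntervalSince(records[i].receivedAt) > sixMonths {
            records[i].deviceIdentifier = nil
        }
    }
}
```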

In iOS, we offer details on the data Siri accesses, and how we protect your information in the process, in Settings > Siri & Search > About Ask Siri & Privacy.

How Your Data Makes Siri Better

In order for Siri to more accurately complete personalized tasks, it collects and stores certain information from your device. For instance, when Siri encounters an uncommon name, it may use names from your Contacts to make sure it recognizes the name correctly.

Siri also relies on data from your interactions with it. This includes the audio of your request and a computer-generated transcription of it. Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that “trains” Siri to improve.

Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests — less than 0.2 percent — and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability. For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request?
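
To make the sampling concrete, here is a minimal Swift sketch of selecting a small fraction of requests for grading; the exact rate and all names are assumptions, not Apple’s actual pipeline:

```swift
// Hypothetical sketch: sample well under 0.2 percent of requests for
// quality review. The rate and names are illustrative only.
let gradingSampleRate = 0.0015

func selectedForGrading() -> Bool {
    Double.random(in: 0..<1) < gradingSampleRate
}

let requestIDs = (1...100_000).map { "req-\($0)" }
let sampled = requestIDs.filter { _ in selectedForGrading() }
print("Selected \(sampled.count) of \(requestIDs.count) requests for grading")
```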

Changes We’re Making

As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

• First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

• Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

• Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.

For more information: Siri Privacy and Grading

MacDailyNews Take: Allowing users to opt in to help improve Siri is an excellent idea (if we don’t say so ourselves)!

Apple’s stance on privacy should be sacrosanct, but not to the detriment of quality. Allowing users to opt in to help improve Siri is a good solution. — MacDailyNews, August 2, 2019

10 Comments

  1. As for anyone who’s ever been caught: they’re only sorry for being caught. Apple ignored Siri for the longest time. Apple just found it easier to record their customers’ conversations and feed them to Siri. Like a parent throwing their child in front of the TV.

    1. You are well known on this forum for your negative, cynical outlook, bjr001. But I see no reason to paint Apple with the same brush as most other corporations. It is just as wrong to smear everyone equally as it is to excuse everyone equally.

      Apple is a rarity in terms of its culture and commitment to the privacy of the end user. Apple is not like Google, Facebook, Amazon, or many other companies whose primary business revolves around aggregating and monetizing user data. Apple is not perfect and I agree that one has to be careful about trusting a corporation. The corporation is only as good as the weakest and most corrupt person in the upper management structure. But Apple doesn’t just talk a good game, it implements its consumer protection policies in its products.

      Read the Apple statement above. Read what Apple has posted on its website. Consider its product design – the Secure Enclave chip, improvements to user authentication, etc. Apple is different and seeks to continuously improve as an organization.

  2. “MacDailyNews Take: Allowing users to opt in to help improve Siri is an excellent idea (if we don’t say so ourselves)!”

    Apple makes more than enough money to be able to throw resources at Siri. Instead, they chose the cheap, throw-privacy-to-the-wind approach of recording their customers as a way to improve Siri.

    1. I disagree with your skepticism. Apple’s approach was not optimal and that was brought to management’s attention. Now it is better. Ideally Apple would have implemented the new strategy from the start. But they did not. That was a mistake.

      If your negative feelings about Apple are this strong, I can only imagine how you must feel about Google. Think about all of the data that Google has collected about you – reading every Gmail ever sent (or received!) by you, collecting all of your search data, map queries, websites visited, where you went in those websites, your location, etc. If you want to fix things, then start with the biggest area of corruption and work your way down. Apple is far from the top.

      1. When Jobs came back to Apple, things were not good. You could say Apple was down for the count. Jobs’s decision to contact Microsoft and accept the $150 million investment from Gates saved the company. It was Steve’s decision. Many were against it. Later, Jobs cleaned up the product line, including removing products. Many people were also against that.

        All this made Apple what it is today. You and others are of the mindset that wants to gloss over issues that can become problems later for the sake of short-term gain. I and others are of Jobs’s mindset: ensure Apple’s success, catch issues early, and produce quality products.

  3. No company that “views privacy as a fundamental human right” would have saved Siri recordings and allowed contractors to listen to them without the customer’s consent or knowledge.

    Apple is upset they got caught.
    Apple is far better regarding customer privacy than Google or Facebook, but that’s like saying you’re good because you’re not as evil as the other guys.
    Fail.

  4. The entire point of having humans listen to and transcribe the audio in these cases is to help correct the respective AIs’ errors in parsing human speech. Apple has long touted their AI chip allowing on-device processing. Why not allow users themselves to review audio recorded on their devices by Siri, correct it, make Siri a bit smarter, and anonymously upload that to Apple for improving Siri in general? This may be the best solution for keeping user audio ‘private’ while still allowing Siri to ‘learn’. Apple might be squandering the AI ‘edge’ they currently have, based on the announcements made at Google I/O 2019 back in May.

    https://www.designnews.com/electronics-test/google-bringing-ai-edge-everyone-2019/1500467160761?ADTRK=UBM&elq_mid=8599&elq_cid=1418267

    1. Apple has no excuse to do this with recordings from customers who did not opt in.

      If it cannot do that, then there should be a law. A company the size of Apple should be able to run this completely in-house using employee devices.

      No excuses.

      1. Did you read past my first sentence? My point was to make participation the user’s choice and to restrict all audio files to users’ devices, obviating any opt-in. Apple would have no chance to access or gain anything unless the user participates. Siri ‘learning’ would then be the user community’s responsibility, with Apple just improving the Siri tech.
