Apple suspends Siri response grading program after privacy concerns

Reuters:

Apple Inc. said on Friday it suspended its global program where it analyzed recordings from users interacting with its voice assistant Siri, after some privacy concerns were raised about the program.

“While we conduct a thorough review, we are suspending Siri grading globally,” an Apple spokeswoman said in a statement, adding that in a future software update, users will be able to opt out of the program.

In an effort to perform quality checks and improve the voice assistant’s responses, contractors graded Siri’s answers to user queries, The Guardian reported. They also looked at whether the response was triggered accidentally, without a deliberate query from the user, the newspaper said.

MacDailyNews Take: Apple’s statement, verbatim: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Apple’s stance on privacy should be sacrosanct, but not to the detriment of quality. Allowing users to opt in to help improve Siri is a good solution.

7 Comments

  1. Cue all the folks who will be complaining in a few months that Siri doesn’t seem to be improving. Opt out is a good notion, but the data was already anonymized. Where was all the outrage against the other voice assistants that were doing even more listening without any privacy measures?

    1. Exactly. I’m concerned about privacy, but I’m more worried about government eavesdropping and surveillance where they DO monitor who you are.

      I wouldn’t trust Google any more than Facebook at this point.

      However I think the hysteria concerning the Apple system was a bit alarmist.

      1. It only seems alarmist after the coverage of Alexa and Google Assistant being accused of the same a few weeks ago. People, especially iOS users, gained a false sense of security that Siri wasn’t one of the group. This week users feel their trust was misplaced. That difference between what users ‘expect’ of Google/Amazon’s assistants vs Siri makes it ‘feel’ worse. Apple just had a lot more to live up to, partly because they keep advertising their Security/Privacy as being so much better. The bit about how Apple Watch automatically starts listening by default w/o a trigger word didn’t help matters.

  2. The Siri grading recordings are completely anonymous! There is no privacy issue.

    The press is creating a false equivalence between Siri grading and the things Google and Amazon are doing…

    Yet another case of bad tech journalism. Things like this help the Trump-hole get away with his bogus claims of fake news. The press needs to hold itself to a higher standard.

    1. Unless you are actually familiar with the process each company takes in securing and using audio files generated from their respective Assistants, you shouldn’t be accusing anyone of false equivalencies or bad tech journalism.

It is good to see Apple admitting that they too use audio recordings to improve Siri. However, keeping mum while others were accused of the same seems ‘dishonest’ for the face Apple displays publicly. Following Google and Amazon now with a system that gives the user a choice to ‘participate’ is a good first step. Having the choice be an ‘opt-in’ vs. Google/Amazon’s ‘opt-out’ may help toward rebuilding trust.

      1. Rereading the article excerpt and MDN above it seems Apple is also doing an ‘opt-out’ and it was actually MDN’s suggestion for an ‘opt-in’.

  3. I might sound like a Luddite but the easiest way of avoiding these issues is avoid using voice assistants. If you don’t use them then you don’t have to worry about this (particular) privacy issue.

