“Tens of millions of people use smart speakers and their voice software to play games, find music or trawl for trivia. Millions more are reluctant to invite the devices and their powerful microphones into their homes out of concern that someone might be listening,” Matthew Day, Giles Turner, and Natalia Drozdiak report for BloombergQuint. “Sometimes, someone is.”

“Amazon.com Inc. employs thousands of people around the world to help improve the Alexa digital assistant powering its line of Echo speakers. The team listens to voice recordings captured in Echo owners’ homes and offices. The recordings are transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands,” Day, Turner, and Drozdiak report. “The team comprises a mix of contractors and full-time Amazon employees who work in outposts from Boston to Costa Rica, India and Romania, according to the people, who signed nondisclosure agreements barring them from speaking publicly about the program. They work nine hours a day, with each reviewer parsing as many as 1,000 audio clips per shift, according to two workers based at Amazon’s Bucharest office.”

“One worker in Boston said he mined accumulated voice data for specific utterances such as ‘Taylor Swift’ and annotated them to indicate the searcher meant the musical artist. Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help,” Day, Turner, and Drozdiak report. “Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.”

Read more in the full article here.

MacDailyNews Take: Most of us here suspected as much from the get-go, you know, because we think, and because we know just how horribly user privacy has been handled by some companies in the past.

Most distressing is that, according to Bloomberg, “the recordings sent to the Alexa auditors do not provide a user’s full name and address but are associated with an account number, as well as the user’s first name and the device’s serial number.”

So, for all practical purposes, these recordings are identifiable. Users should consider where they place their Amazon Echo and Alexa-capable devices in their homes and offices, and what those devices could hear, record, and transmit.

Of note, Bloomberg also reports:

Apple’s Siri also has human helpers, who work to gauge whether the digital assistant’s interpretation of requests lines up with what the person said. The recordings they review lack personally identifiable information and are stored for six months tied to a random identifier, according to an Apple security white paper. After that, the data is stripped of its random identification information but may be stored for longer periods to improve Siri’s voice recognition. At Google, some employees can access some audio snippets from its Assistant to help train and improve the product, but it’s not associated with any personally identifiable information and the audio is distorted, the company says.

SEE ALSO:
Bad news for those who want Facebook listening in their homes: Facebook delays smart speaker debut amid data privacy scandal – March 29, 2018
Google Home Mini spies on everything reviewer said 24/7, transmits recordings to Google’s servers – October 12, 2017
Amazon Echo murder case spotlights question of what ‘always on’ actually means – December 28, 2016
Apple: Hey Siri and Live Photos data stays only on your device to ensure privacy – September 12, 2015