“Many people have grown accustomed to talking to their smart devices, asking them to read a text, play a song or set an alarm,” Craig S. Smith reports for The New York Times. “But someone else might be secretly talking to them, too.”
“Over the past two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant,” Smith reports. “Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.”
“‘We wanted to see if we could make it even more stealthy,’ said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors,” Smith reports. “Mr. Carlini added that while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them. ‘My assumption is that the malicious people already employ people to do what I do,’ he said.”
“Amazon said that it doesn’t disclose specific security measures, but it has taken steps to ensure its Echo smart speaker is secure. Google said security is an ongoing focus and that its Assistant has features to mitigate undetectable audio commands. Both companies’ assistants employ voice recognition technology to prevent devices from acting on certain commands unless they recognize the user’s voice,” Smith reports. “Apple said its smart speaker, HomePod, is designed to prevent commands from doing things like unlocking doors, and it noted that iPhones and iPads must be unlocked before Siri will act on commands that access sensitive data or open apps and websites, among other measures.”
Read more in the full article at The New York Times.
MacDailyNews Take: Great. Subliminal malware… or “mal-audio.” Wonderful.