“Siri may be your personal assistant. But your voice is not the only one she listens to,” Andy Greenberg reports for Wired. “As a group of French researchers have discovered, Siri also helpfully obeys the orders of any hacker who talks to her—even, in some cases, one who’s silently transmitting those commands via radio from as far as 16 feet away.”
“A pair of researchers at ANSSI, a French government agency devoted to information security, have shown that they can use radio waves to silently trigger voice commands on any Android phone or iPhone that has Google Now or Siri enabled, if it also has a pair of headphones with a microphone plugged into its jack,” Greenberg reports. “Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone.”
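The chain Greenberg describes (radio wave in, audio out) can be seen with a few lines of toy math: an audio-band signal amplitude-modulated onto a radio carrier, once induced on the headphone wire and rectified by the phone's audio front end, comes out the other side as ordinary audio. The Swift sketch below simulates that chain numerically; the carrier frequency, the choice of amplitude modulation, and the envelope detector are simplifying assumptions for illustration, not the researchers' actual signal setup.

```swift
import Foundation

let sampleRate = 1_000_000.0   // 1 MHz simulation rate (illustrative)
let carrierHz  = 100_000.0     // stand-in radio carrier
let voiceHz    = 440.0         // stand-in tone for a spoken command

// The signal induced on the headphone cord: an audio tone
// amplitude-modulated onto the carrier, (1 + voice(t)) * carrier(t).
let induced = (0..<20_000).map { i -> Double in
    let t = Double(i) / sampleRate
    let voice = 0.5 * sin(2.0 * Double.pi * voiceHz * t)
    return (1.0 + voice) * sin(2.0 * Double.pi * carrierHz * t)
}

// Crude envelope detector standing in for the front end's nonlinearity:
// rectify, then low-pass with a moving average (roughly 5 kHz cutoff).
let rectified = induced.map { abs($0) }
let window = 200
let envelope = (0..<(rectified.count - window)).map { i in
    rectified[i..<(i + window)].reduce(0.0, +) / Double(window)
}

// `envelope` now swings at ~440 Hz around a DC offset: an audio-band
// signal the phone's OS would take for microphone input.
```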
Greenberg reports, “Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, send the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.”
“The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended other fixes, too: They advise that better shielding on headphone cords would force attackers to use a higher-power radio signal, for instance, or an electromagnetic sensor in the phone could block the attack,” Greenberg reports. “But they note that their attack could also be prevented in software, too, by letting users create their own custom ‘wake’ words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands.”
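For a sense of what the software-side fix could look like, here is a minimal Swift sketch of the custom-wake-word idea: a transcription is only treated as a command if it opens with a phrase the user chose, so a broadcast "Hey Siri" or "OK Google" from an attacker's radio is simply ignored. The type, method, and wake phrase are all hypothetical; neither Siri nor Google Now exposed such a hook at the time of the report.

```swift
import Foundation

// Hypothetical gate implementing the researchers' suggested mitigation:
// only a transcription that begins with the user's own wake phrase
// is passed along as a command.
struct WakeWordGate {
    let customWakePhrase: String   // chosen by the user, unknown to an attacker

    /// Returns the command with the wake phrase stripped,
    /// or nil if the transcription should be ignored.
    func command(from transcription: String) -> String? {
        let heard = transcription
            .trimmingCharacters(in: .whitespacesAndNewlines)
            .lowercased()
        let wake = customWakePhrase.lowercased()
        guard heard.hasPrefix(wake) else { return nil }
        return String(heard.dropFirst(wake.count))
            .trimmingCharacters(in: .whitespaces)
    }
}

let gate = WakeWordGate(customWakePhrase: "rumpelstiltskin")
print(gate.command(from: "Hey Siri, call 555-0100") as Any)   // nil: rejected
print(gate.command(from: "Rumpelstiltskin call Mom") as Any)  // Optional("call mom")
```

In a real assistant this gate would sit behind the speech recognizer, and the researchers' other suggested fix, voice recognition of the owner, would reject the phrase even if an attacker somehow learned it.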
Read more in the full article here.
MacDailyNews Take: Custom wake words for Siri would be very welcome regardless.