
posted by Fnord666 on Thursday September 07 2017, @01:46PM   Printer-friendly
from the careless-whispers dept.

Submitted via IRC for SoyCow1937

Hacks are often caused by our own stupidity, but you can blame tech companies for a new vulnerability. Researchers from China's Zhejiang University found a way to attack Siri, Alexa and other voice assistants by feeding them commands in ultrasonic frequencies. Those are too high for humans to hear, but they're perfectly audible to the microphones on your devices. With the technique, researchers could get the AI assistants to open malicious websites and even your door if you had a smart lock connected.

The relatively simple technique is called DolphinAttack. Researchers first translated human voice commands into ultrasonic frequencies (over 20,000 Hz). They then simply played them back from a regular smartphone equipped with an amplifier, ultrasonic transducer and battery -- less than $3 worth of parts.
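
To make the "translation" step concrete, here is a minimal sketch (Python/NumPy) of how a voice command could be amplitude-modulated onto an ultrasonic carrier; the carrier frequency, sample rate and modulation depth are illustrative assumptions, not values taken from the researchers' paper.

    # Hedged sketch: shift a baseband "voice command" above 20 kHz via
    # amplitude modulation. All parameters are illustrative assumptions.
    import numpy as np

    fs = 192_000                    # sample rate high enough for a >20 kHz carrier
    fc = 25_000                     # ultrasonic carrier frequency (inaudible to humans)
    t = np.arange(0, 1.0, 1 / fs)

    # Stand-in for a recorded voice command (here just a 1 kHz tone).
    baseband = np.sin(2 * np.pi * 1_000 * t)

    # Amplitude-modulate the command onto the ultrasonic carrier. A
    # microphone's nonlinear front end can demodulate this back into the
    # audible band, which is how the assistant ends up "hearing" it.
    carrier = np.sin(2 * np.pi * fc * t)
    ultrasonic = (1 + 0.8 * baseband) * carrier
    ultrasonic /= np.max(np.abs(ultrasonic))   # normalize before playback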

What makes the attack scary is the fact that it works on just about anything: Siri, Google Assistant, Samsung S Voice and Alexa, on devices like smartphones, iPads, MacBooks, Amazon Echo and even an Audi Q3 -- 16 devices and seven systems in total. What's worse, "the inaudible voice commands can be correctly interpreted by the SR (speech recognition) systems on all the tested hardware." Suffice it to say, it works even if the attacker has no device access and the owner has taken the necessary security precautions.

Source: https://www.engadget.com/2017/09/06/alexa-and-siri-are-vulnerable-to-silent-nefarious-commands/


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by ese002 on Thursday September 07 2017, @05:49PM (1 child)

    by ese002 (5306) on Thursday September 07 2017, @05:49PM (#564672)

    Why is audio outside some ranges accepted? Throw all the input through some bandpass filter (not really rocket science) before parsing it and this should not be an issue any more.

    Naturally, because this sort of attack was not considered, and deliberately limiting functionality to guard against usage that isn't fully understood is seldom a good use of resources.

    However, people have been talking for quite a while about how voice-controlled systems should be trained to recognize authorized users and not respond to other voices. If that had actually been done, this "discovery" would have been a no-op, assuming the authorized user isn't a gerbil.
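
    For illustration, a minimal sketch of the band-pass idea quoted above (Python/SciPy); the 80 Hz - 8 kHz cutoffs and filter order are assumptions chosen here, not anything specified in the story.

        # Discard energy outside a rough human-voice band before the audio
        # reaches the speech recognizer. Cutoffs and order are assumptions.
        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def voice_bandpass(samples: np.ndarray, fs: int) -> np.ndarray:
            """Keep roughly the human-voice band; attenuate ultrasonic content."""
            sos = butter(6, [80, 8_000], btype="bandpass", fs=fs, output="sos")
            return sosfiltfilt(sos, samples)

        fs = 48_000
        audio = np.random.randn(fs)            # stand-in for captured microphone input
        filtered = voice_bandpass(audio, fs)   # what the recognizer would then see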

    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 0) by Anonymous Coward on Thursday September 07 2017, @11:26PM

    by Anonymous Coward on Thursday September 07 2017, @11:26PM (#564822)

    Yeah. Combine your comment with the AC's comment concerning XKCD, currently directly above yours (I remembered that one too), and you quickly see just how low-tech an exploit against this crappy, half-done "design" could be.

    Add in the Little Bobby Tables comment.
    Now, add in the bandpass filters already mentioned.

    The level of "engineering" in this concept is severely lacking.
    Did these "engineers" never before encounter the concept of "what if"?
    How about "design review"?

    -- OriginalOwner_ [soylentnews.org]