
posted by janrinok on Friday November 08 2019, @03:32AM
from the hear-the-lights-dude dept.

Submitted via IRC for soylent_red

Using Light Beams to Control Google, Apple, Amazon Assistants

Academic researchers have found that certain microphones respond to light as if it were sound, allowing voice commands to be sent to voice-controlled (VC) devices like Google Home, Amazon Echo, Facebook Portal, smartphones, or tablets.

Dubbed Light Commands, the attack works from afar by shining a laser beam at microphones that use micro-electro-mechanical systems (MEMS), which convert the light into an electrical signal.

By modulating the intensity of the light beam, an attacker can trick MEMS microphones into producing the same electrical signals that real audio commands would produce. With careful aiming and laser focusing, attacks can succeed from as far as 110 meters.
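To make the modulation step concrete, here is a minimal Python sketch of the idea (not the researchers' code). The file name command.wav, the bias, and the modulation depth are illustrative assumptions, and how a particular laser driver accepts its modulation input will vary.

    # Minimal sketch: turn a recorded voice command into an intensity-modulation
    # signal for a laser driver. File names and constants are hypothetical.
    import numpy as np
    from scipy.io import wavfile

    rate, audio = wavfile.read("command.wav")      # recorded voice command
    audio = audio.astype(np.float64)
    if audio.ndim > 1:
        audio = audio[:, 0]                        # use one channel
    audio /= np.max(np.abs(audio))                 # normalize to [-1, 1]

    bias = 0.5    # keeps the laser above threshold; in practice the DC bias usually
                  # comes from the driver itself, since sound-card outputs are AC-coupled
    depth = 0.4   # modulation depth, kept small so the drive stays within driver limits
    drive = bias + depth * audio                   # light intensity follows the audio waveform

    # Save as WAV so an ordinary sound card can play it into the driver's modulation input.
    wavfile.write("drive.wav", rate, (drive * 32767).astype(np.int16))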

In their experiments, researchers from the University of Electro-Communications in Japan and the University of Michigan tested the attack on popular VC devices.

The voice recognition systems in Google Home, Nest Cam, Amazon Echo, Fire TV Cube, iPhone, Samsung Galaxy S9, Google Pixel, and iPad were tested from various distances.

A Light Commands attack delivers inaudible instructions to a voice-controlled device and makes it act on them. The researchers demonstrated that it can be used to open a garage door or unlock the front door of a house.

No large investment is needed to pull this off, either. A low-cost setup used by the researchers consisted of a normal laser pointer, a Wavelength Electronics laser driver ($339), and a Neoteck NTK059 sound amplifier ($27.99). A computer that plays the recorded audio commands is also required. Laser beams provide precise aiming, but the researchers showed that Light Commands attacks also work with a laser flashlight (Acebeam W30). From 10 meters, they were able to inject commands into Google Home.
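For the playback side, here is a rough sketch of what "a computer that plays the recorded audio commands" amounts to, assuming the hypothetical drive.wav file from the sketch above and the sounddevice library (any audio-playback route would do):

    # Play the prepared drive signal out of the sound card; the line-out feeds the
    # audio amplifier, which in turn feeds the laser driver's modulation input.
    import sounddevice as sd
    from scipy.io import wavfile

    rate, drive = wavfile.read("drive.wav")   # hypothetical file from the earlier sketch
    sd.play(drive, rate)                      # sound card -> amplifier -> laser driver
    sd.wait()                                 # block until playback finishes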


Original Submission

  • (Score: 2) by takyon on Friday November 08 2019, @03:48AM (2 children)

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Friday November 08 2019, @03:48AM (#917729) Journal

    Lackluster outcome.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Friday November 08 2019, @04:31AM (1 child)

      by Anonymous Coward on Friday November 08 2019, @04:31AM (#917753)

      Not so lackluster outcome:

      Alexa, call 911, tell them we have a shooter in the house.

  • (Score: 2) by NotSanguine on Friday November 08 2019, @04:16AM (4 children)

    by NotSanguine (285) <NotSanguineNO@SPAMSoylentNews.Org> on Friday November 08 2019, @04:16AM (#917748) Homepage Journal

    Beam weapons seem like an expensive way to "control" such devices. A hammer, tire iron or shotgun can do just as good a job, IMHO.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 2) by c0lo on Friday November 08 2019, @04:45AM (1 child)

      by c0lo (156) Subscriber Badge on Friday November 08 2019, @04:45AM (#917762) Journal

      Beam weapons seem like an expensive way to "control" such devices.

      On the upside, disabling the alarm and making an inconspicuous entry to access the loot - aka former property of the idiot that uses IoT controlled via Google assistant/Alexa? On a budget under $500?
      I'd say the thieves guild should lower the professional indemnity insurance premium for any of their members that use this equipment (instead of hammer, tire iron or shotgun).

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 5, Funny) by NotSanguine on Friday November 08 2019, @04:50AM

        by NotSanguine (285) <NotSanguineNO@SPAMSoylentNews.Org> on Friday November 08 2019, @04:50AM (#917764) Homepage Journal

        On the upside, disabling the alarm and making an inconspicuous entry to access the loot - aka former property of the idiot that uses IoT controlled via Google assistant/Alexa? On a budget under $500?
        I'd say the thieves guild should lower the professional indemnity insurance premium for any of their members that use this equipment (instead of hammer, tire iron or shotgun).

        Thieves? I'm talking about how the owners of these devices should "control" them.

        Oh, is this about stealing? Sorry. I didn't read TFS, just the headline.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 1, Informative) by Anonymous Coward on Friday November 08 2019, @02:10PM (1 child)

      by Anonymous Coward on Friday November 08 2019, @02:10PM (#917855)

      The advantage this has over a hammer, tire iron, shotgun, or even just an audio player is that you don't have to enter the house - the telescreen just needs to be visible from a window. So no, for some applications, it would be a lot better.

      If you combine it with some kind of speech synthesis software, you could theoretically cause it to do anything that the owner could do without the owner being in the house. You could certainly use it to order your neighbor 100 pizzas, but I'm sure the espionage folks will come up with better uses. It's not unrealistic to imagine a court case where the "evidence" is a recording of someone committing murder, only that entire recording was fabricated and beamed into the telescreen from a black van on the street (or possibly a small drone, if it was sitting on the windowsill).

      • (Score: 2) by NotSanguine on Friday November 08 2019, @02:21PM

        by NotSanguine (285) <NotSanguineNO@SPAMSoylentNews.Org> on Friday November 08 2019, @02:21PM (#917856) Homepage Journal

        The advantage this has over a hammer, tire iron, shotgun...is that you don't have to enter the house

        Why wouldn't I want to enter my own house?

        And I'll use whatever tools I want to control my own stuff. Not that I would ever buy such a device, except to smash it with a hammer or a tire iron, or give it both barrels.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 0) by Anonymous Coward on Friday November 08 2019, @04:10PM

    by Anonymous Coward on Friday November 08 2019, @04:10PM (#917915)

    Besides commercials on TV/radio saying Alexa.
    A Harley Davidson with loud exhaust drove past my house and set off Alexa.
    Saying only "Lexa" sets it off.
    Any more to add?

  • (Score: 3, Informative) by DannyB on Friday November 08 2019, @04:50PM

    by DannyB (5839) Subscriber Badge on Friday November 08 2019, @04:50PM (#917934) Journal

    A couple years ago (sorry no link) there was a different attack on voice controlled personal assistants.

    The voice can be higher pitched. Even higher pitched. Even so high-pitched that it's ultrasonic -- but Alexa / Siri / etc. don't mind!
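    (That sounds like the 2017 "DolphinAttack" work: amplitude-modulate the command onto an ultrasonic carrier and let the nonlinearity of the microphone front end demodulate it back into the audible band. A rough sketch of the idea; carrier frequency, sample rate, and file names are made up for illustration, and you'd need hardware that can actually reproduce ultrasound.)

        # Rough sketch: AM a recorded command onto a ~30 kHz carrier.
        import numpy as np
        from scipy.io import wavfile

        rate, voice = wavfile.read("command.wav")    # hypothetical recorded command
        voice = voice.astype(np.float64)
        if voice.ndim > 1:
            voice = voice[:, 0]                      # use one channel
        voice /= np.max(np.abs(voice))

        out_rate = 192_000                           # high enough to represent the carrier
        carrier_hz = 30_000                          # above human hearing
        t = np.arange(int(len(voice) * out_rate / rate)) / out_rate
        voice_up = np.interp(t, np.arange(len(voice)) / rate, voice)  # crude resample

        # The microphone's nonlinearity acts as the AM demodulator.
        modulated = (1.0 + 0.8 * voice_up) * np.cos(2 * np.pi * carrier_hz * t)
        wavfile.write("ultrasonic.wav", out_rate, (modulated / 1.8 * 32767).astype(np.int16))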

    There was another attack where it was possible to take a recorded voice command and manipulate it to be recognized as a different command.

    What you think you heard: "Alexa, what's the weather?"
    What Alexa heard: "Alexa, surf to evil.com"

    This manipulation was done based on how the neural network worked. There was a paper on it. I don't have it anymore. I think I saw it on news.ycombinator.com.

    Imagine a combination of the attacks.
    1. It's a laser beam pointed at the mic
    2. Sending an ultrasonic voice command
    3. The command sounds to a human like a different command than what Alexa interprets it to be

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.