Using Light Beams to Control Google, Apple, Amazon Assistants

Accepted submission by upstart at 2019-11-06 17:06:26
/dev/random

Submitted via IRC for soylent_red

Using Light Beams to Control Google, Apple, Amazon Assistants [bleepingcomputer.com]

Academic researchers have found that certain microphones respond to light as if it were sound, allowing attackers to send voice commands to voice-controlled (VC) devices such as Google Home, Amazon Echo, and Facebook Portal, as well as smartphones and tablets.

Dubbed Light Commands, the attack works from afar by shining a laser beam at microphones built with micro-electro-mechanical systems (MEMS), which convert the incoming light into an electrical signal just as they would sound.

By modulating the intensity of the light beam, an attacker can trick a MEMS microphone into producing the same electrical signals an audio command would. With careful aiming and focusing of the laser, attacks can succeed from as far as 110 meters.
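At its core this is simple amplitude modulation. As a rough illustration, the Python sketch below maps a normalized audio waveform onto a laser diode's drive current around a DC bias; the sample rate, bias current, and modulation depth are illustrative assumptions, not values from the paper.

    import numpy as np

    # Sketch of the amplitude-modulation idea behind Light Commands
    # (illustrative only): the normalized voice waveform v(t) is mapped
    # onto the laser diode drive current as i(t) = I_DC + k * v(t), so the
    # emitted light intensity follows the audio the MEMS mic will "hear".

    SAMPLE_RATE = 16_000   # audio sample rate in Hz (assumed)
    I_DC = 200.0           # diode bias current in mA (illustrative value)
    K = 150.0              # modulation depth in mA per unit amplitude (illustrative)

    def audio_to_drive_current(voice):
        """Map a normalized audio waveform to a laser drive-current waveform."""
        voice = voice / np.max(np.abs(voice))   # normalize to [-1, 1]
        current = I_DC + K * voice              # amplitude modulation around the bias
        return np.clip(current, 0.0, None)      # a diode current cannot go negative

    # Example: a 1 kHz test tone standing in for a recorded voice command
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * 1000.0 * t)
    drive_ma = audio_to_drive_current(tone)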

Long-range attack

In their experiments, researchers from the University of Electro-Communications in Japan and the University of Michigan tested the attack on popular VC devices.

The voice recognition systems in the Google Home, Nest Cam, Amazon Echo, Fire Cube TV, iPhone, Samsung Galaxy S9, Google Pixel, and iPad were tested from various distances.

Device                      Voice Recognition    Min. Laser Power    Max Distance     Max Distance
                            System               at 30 cm [mW]       at 60 mW [m]*    at 5 mW [m]**
--------------------------------------------------------------------------------------------------
Google Home                 Google Assistant     0.5                 50+              110+
Google Home Mini            Google Assistant     16                  20               -
Google NEST Cam IQ          Google Assistant     9                   50+              -
Echo Plus 1st Generation    Amazon Alexa         2.4                 50+              110+
Echo Plus 2nd Generation    Amazon Alexa         2.9                 50+              50
Echo                        Amazon Alexa         25                  50+              -
Echo Dot 2nd Generation     Amazon Alexa         7                   50+              -
Echo Dot 3rd Generation     Amazon Alexa         9                   50+              -
Echo Show 5                 Amazon Alexa         17                  50+              -
Echo Spot                   Amazon Alexa         29                  50+              -
Facebook Portal Mini        Alexa + Portal       18                  5                -
Fire Cube TV                Amazon Alexa         13                  20               -
Ecobee 4                    Amazon Alexa         1.7                 50+              70
iPhone XR                   Siri                 21                  10               -
iPad 6th Gen                Siri                 27                  20               -
Samsung Galaxy S9           Google Assistant     60                  5                -
Google Pixel 2              Google Assistant     46                  5                -

*  "50+" means the attack still succeeded at 50 m, the longest distance available for testing at 60 mW.
** "110+" means the attack still succeeded at 110 m, the longest distance available for testing at 5 mW.

A Light Commands attack sends inaudible instructions to a voice-controlled device, which carries them out as if they had been spoken aloud. The researchers demonstrated that the technique can be used to open a garage door or to unlock the front door of a house.

No large investment is needed to pull this off, either. The low-cost setup used by the researchers consisted of an ordinary laser pointer, a Wavelength Electronics laser driver ($339), and a Neoteck NTK059 audio amplifier ($27.99), plus a computer to play back the recorded audio commands.

Laser beams provide precise aiming, but the researchers showed that Light Commands attacks also work with a laser flashlight (Acebeam W30): from 10 meters, they were able to inject commands into a Google Home.

A flashlight's wide beam covers the target device completely, so no precise aiming is needed. This imprecision has its downsides, though: limited range and the risk of triggering microphones on other nearby devices.

For long-range attacks, additional gear is required to focus the beam on the right spot: a telescope, a telephoto lens, and a tripod for focus and accurate aiming.

Windows are not an obstacle as long as there is a direct line of sight between the source of the light and the target device.

In one test, the attack succeeded despite a double-pane glass window and windy conditions; reflections were negligible, the researchers write in the paper detailing Light Commands injection attacks.

To run the experiments, the researchers recorded four commands: asking the time, setting the volume to zero, ordering a laser pointer, and opening a garage door. Each recording was prefixed with the device's predefined wake-up phrase ("OK Google," "Hey Siri," "Alexa," or "Hey Portal").
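Assembling such an attack clip is straightforward concatenation of the wake-up phrase and the command audio. A minimal Python sketch, with purely hypothetical file names, might look like this:

    import wave

    # Hypothetical sketch: prepend a recorded wake-up phrase to a recorded
    # command to form the clip played through the laser driver. File names
    # are illustrative; the WAV files must share sample rate, sample width,
    # and channel count for simple concatenation to be valid.

    def concat_wavs(parts, out_path):
        frames, params = [], None
        for path in parts:
            with wave.open(path, "rb") as w:
                if params is None:
                    params = w.getparams()     # take format from the first file
                frames.append(w.readframes(w.getnframes()))
        with wave.open(out_path, "wb") as out:
            out.setparams(params)
            for chunk in frames:
                out.writeframes(chunk)

    concat_wavs(["ok_google.wav", "open_garage_door.wav"], "attack_clip.wav")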

Real-life limitations

Although Light Commands is a novel type of attack, it is hard to imagine it succeeding outside the controlled conditions of an experiment, so there is no reason for concern at the moment.

A threat actor has to contend with limitations such as maintaining a line of sight to the device and avoiding barriers along the way, since light does not pass well through opaque or scattering media such as fog or tinted windows.

Furthermore, the victim may be alerted by the visible light beam unless infrared is used, which requires additional gear; even then, the audible response from the target device confirming execution of the command could give the attack away.

The target device itself can also pose a problem. A smart speaker sitting by a window is an easier target than a smartphone or tablet, which is designed for mobility and may well be placed by its owner where there is no direct line to its microphone.

The Light Commands research is the work of Takeshi Sugawara (University of Electro-Communications, Japan) and Benjamin Cyr, Sara Rampazzi, Daniel Genkin, and Kevin Fu (University of Michigan). Details are provided in their paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems" (PDF [lightcommands.com]). A website [lightcommands.com] has also been set up with an overview of this type of attack.


Original Submission