Lasers can seemingly hack Alexa, Google Home and Siri

Yolanda Curtis
November 5, 2019

Whether or not the NSA is eavesdropping while you yell at Alex Trebek may be just the beginning of our problems, as researchers in Japan and at the University of Michigan have figured out how to hack Google Home, Amazon Alexa, and Apple Siri devices "from hundreds of feet away" using laser pointers and flashlights.

Researchers suggest smart speaker makers can fix this vulnerability by adding a light shield around the microphone or by using two microphones on opposite sides of the device to listen for voice commands. They showed how a hacker could use light-injected voice commands to unlock smart-lock protected doors and garage doors, shop online without the target's knowledge, and even locate and unlock the target's vehicle, provided these are all connected to the voice-controllable system.

Criminals could also use the hack to open car doors, change the settings on a thermostat or make purchases online, according to the research from Japan and the University of Michigan. Shining the modulated light at the device remotely causes the microphone to produce electrical signals representing the attacker's commands.


"By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio", the researchers said in their paper [PDF]. The researchers said that distance was the longest area they could use (a hallway) when conducting tests.

Researchers in Japan and at the University of Michigan have been studying the issue for seven months, The New York Times explains. The attack has practical limits: a targeted device must be in the attacker's direct line of sight so the laser can be aimed at the specific part of the microphone, the beam needs to hit the microphone at just the right angle, and the mics are rarely in a convenient location. It also requires specialised equipment, such as a laser pointer, a laser driver, a sound amplifier and a telephoto lens.

Usually you have to talk to voice assistants to get them to do what you want. Light-based command injection may change the equation.


Researchers said, however, that the weakness can't truly be fixed without redesigning the MEMS microphones built into these devices, which would be a lot more complicated.

It's not just Google Home and Amazon Echo that are susceptible to this hack.

Smart speakers like Google Home (Nest), Apple HomePod, and Amazon Echo are constantly listening using local audio processing, but they only "wake up" when someone says the trigger phrase. In practice, an attack would also only work on an unattended smart speaker, since an owner could notice a light beam reflecting off the device. The researchers also suggest a software mitigation: the voice-controlled system could ask the user a simple randomized question before executing a sensitive command.
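A rough sketch of that randomized-question mitigation might look like the following. The `speak` and `listen` callables are hypothetical placeholders for whatever text-to-speech and speech-recognition interfaces a real assistant exposes; the arithmetic challenge is just one possible form of randomized question, not the researchers' specific proposal.

```python
import random

def confirm_with_challenge(command: str, speak, listen) -> bool:
    """Ask a random spoken question before executing a sensitive command.

    A light-injected "voice" cannot hear the question or answer it, so the
    command is refused unless a real user replies correctly."""
    a, b = random.randint(2, 9), random.randint(2, 9)
    speak(f"Before I {command}, what is {a} plus {b}?")
    reply = listen()  # assumed to return the recognized reply as text
    if reply.strip() == str(a + b):
        return True   # a present user answered; go ahead
    speak("I didn't get the right answer, so I won't do that.")
    return False
```

The trade-off is obvious: a challenge like this adds friction to every sensitive command, which is presumably why the researchers also point to hardware fixes such as light shields and paired microphones.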


While this is a troubling development for fans of smart home technology, Light Commands isn't going to make all your smart speakers and displays easily hackable. Attempting to hack phones and tablets from a distance using laser beams seems the riskier extension of the attack, and the kind of thing you'd see in spy movies.
