How To Protect Your Smart Speaker From A Laser Attack

Yolanda Curtis
November 6, 2019

Researchers in Japan and at the University of Michigan said Monday they had found a way to take over Google Home, Amazon Alexa and Apple Siri devices from hundreds of feet away by shining laser pointers, and even flashlights, at the devices' microphones.

The team found that microphones in some of the most popular smart speakers and smartphones on the market interpreted the bright light of the laser as sound.

While Google Home is quickly expanding its library of commands, the reality is that Alexa remains the most robust digital assistant when it comes to smart speakers.

They were able to open a garage door by pointing their laser at the home device, for example.


A team of cybersecurity researchers has discovered a clever technique to remotely inject inaudible and invisible commands into voice-controlled devices - all just by shining a laser at the targeted device instead of using spoken words.

By modulating the intensity of the laser to match the frequencies of a human voice, the researchers could use the lasers to issue commands to the devices.

"If you have a laser that can shine through windows and across long distances - without even alerting anyone in the house that you're hitting the smart speaker - there's a big threat in being able to do things a smart speaker can do without permission of the owner", said Benjamin Cyr, a graduate student at the University of Michigan and a paper coauthor.

Researchers spent seven months testing the trick on 17 voice-controlled devices enabled with Alexa, Siri, Facebook Portal and Google Assistant, including Google Home, Echo Dot, Fire Cube, Google Pixel, Samsung Galaxy, iPhone and iPad.


The speakers responded to the light as if it were voice-based sound waves, which raises serious security concerns, especially since most of these devices do not require user verification before executing commands, at least not by default. The paper is available through a new website dedicated to explaining these so-called Light Commands, which essentially use lasers to feed smart speakers bogus commands. Verification may be enabled for many smartphone users, but it is typically absent on dedicated voice assistant devices such as Alexa-powered speakers.

The attack, dubbed Light Commands, works because the diaphragm in a microphone converts sound into electrical signals; a laser whose intensity varies like a sound wave moves the diaphragm in the same way real sound does, so the device registers the light as a spoken command.
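The core principle the researchers describe is amplitude modulation: the audio waveform of a command is mapped onto the brightness of the laser beam. Below is a minimal illustrative sketch of that mapping in Python. This is not the researchers' actual tooling; the function name, the bias and depth parameters, and the use of a pure sine tone in place of a real voice recording are all assumptions made for illustration.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second (CD-quality rate)

def audio_to_laser_intensity(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] to a non-negative laser
    intensity signal (simple amplitude modulation).

    A laser cannot emit negative light, so the audio rides on top
    of a constant DC bias. The microphone's diaphragm responds to
    the *changes* in intensity, effectively recovering the audio.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return bias + depth * audio  # stays within [bias - depth, bias + depth]

# Example: a 440 Hz tone standing in for a recorded voice command
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
intensity = audio_to_laser_intensity(tone)

# The intensity never goes negative, and subtracting the DC bias
# recovers the original waveform - roughly what the microphone "hears"
assert intensity.min() >= 0
recovered = (intensity - 0.5) / 0.4
assert np.allclose(recovered, tone)
```

The key design constraint shown here is the DC bias: because light intensity is non-negative, the voice signal has to be shifted and scaled before it can drive the laser, and the microphone perceives only the fluctuating component.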

The range of the laser attack varied from 16 feet to roughly 250 feet, with one test firing the laser through a window from a building across the street.

From there, the attacker could buy things on Amazon or Google or, worse, open the garage door. They could even have remotely unlocked or started a car that was connected to the device.


The researchers noted that they have seen no evidence of this security issue being exploited in the wild.
