Research finds that lasers can be used to control your smartphone, laptop, or tablet through its voice-assistant service.

A series of experiments by researchers at the University of Michigan and the University of Electro-Communications in Tokyo, kicked off by cybersecurity expert Takeshi Sugawara, has found that hackers can use lasers to target the voice command function in smart speakers. That means devices including smartphones, Amazon Echo speakers, Google Home speakers, and Facebook’s Portal video chat devices can be manipulated by these creative methods. Their paper, “Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems,” reveals some strange and creative hacks that can make your devices act according to the wishes of the laser-wielder.

All of this sounds like fiction; however, it turns out that not only is it doable, it can be done with inexpensive equipment. In the experiments, hacked smart speakers were used to open various kinds of digital locks and, in some cases, even to start vehicles.

How is this even possible?

Many owners place these gadgets near their front doors or garage openers, which gives attackers a new opportunity to break into homes using signal-based attacks.

The microphones on these devices convert sound into electrical signals, and those signals can be forged to carry unauthorized voice commands, “fooling” the smart speaker into unlocking a door or carrying out whatever task the hacker requires.

The trick works because a microphone responds to a precisely aimed laser beam much as it does to sound. By pointing a modulated laser at the microphone of a smartphone or smart speaker, an attacker can remotely cause the microphone to pick up signals containing the malicious commands and instruct the device to carry out whatever the hacker wishes.
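The idea of modulating a command onto a light beam can be sketched in a few lines. This is an illustrative simulation only, not the researchers’ actual tooling: the function names, the stand-in sine-wave “command,” and the bias and depth parameters are all assumptions made for the demo.

```python
import numpy as np

SAMPLE_RATE = 44_100   # audio samples per second
DURATION = 0.01        # seconds of signal for this small demo


def voice_command_waveform(freq_hz=500.0):
    """Stand-in for a recorded voice command: a simple sine tone in [-1, 1]."""
    t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t)


def modulate_laser_intensity(audio, dc_bias=0.5, depth=0.4):
    """Map an audio waveform onto laser intensity.

    Light intensity cannot go negative, so the audio rides on a DC bias:
    I(t) = dc_bias + depth * audio(t), clipped to the [0, 1] output range.
    """
    intensity = dc_bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)


audio = voice_command_waveform()
laser = modulate_laser_intensity(audio)

# A microphone hit by this beam tracks the intensity envelope, so subtracting
# the constant bias and rescaling recovers the original command waveform.
recovered = (laser - laser.mean()) / 0.4
```

The key design point is the DC bias: the laser stays on the whole time, and only its brightness varies with the audio, which is why the microphone “hears” a voice even though no sound is present.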


Consumers assume these devices are protected and can only be accessed by a voice that has been verified multiple times on multiple platforms, so it is shocking for the average consumer that a voice can be morphed, or signals shaped, to deceive the device. These processors are smart, but it would be silly to believe that such intelligent devices have no loopholes. Clearly this is a technological arms race to see who can create the stronger algorithm.

Securing your devices

The next task for the companies building these smart speakers is to create algorithms, hardware, and safeguards that cannot be broken. It is nearly impossible to make any technology unbreakable or fully secure, but manufacturers can always aspire to a higher level of security.

Thankfully, the researchers were the ones to discover these flaws in the security of these smart speakers. Responding to the revelations, a Google spokesperson issued a statement saying that the company was “closely reviewing this research paper. Protecting our users is paramount, and we’re always looking at ways to improve the security of our devices.” An Amazon spokesperson stated, “we are reviewing this research and continue to engage with the authors to understand more about their work.”

Hopefully, smart speaker manufacturers will look into this and strengthen their systems before these techniques fall into the wrong hands, making the devices worth the price tag consumers are paying.
