A team of academic researchers from the University of Electro-Communications, Tokyo, and the University of Michigan has successfully demonstrated that “attackers could remotely inject inaudible and invisible commands” into voice assistants using laser light.

  • Research shows that smart voice assistants can be hacked using laser light.
  • Smart voice assistants rely on the user’s voice to interact with them.
  • Once an attacker has gained access to a voice assistant system, they could use this vulnerability to attack other systems.

If you are using one of the popular voice assistant products at home from top technology companies like Apple, Amazon and Google, this news is for you. A team of academic researchers from the University of Electro-Communications, Tokyo, and the University of Michigan has successfully demonstrated that attackers could remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri, using laser light. Code-named “Light Commands”, the attack can inject malicious commands into several voice-controlled devices, including smart speakers, tablets and phones.

The demonstration videos released by the researchers earlier this week show that these commands can be sent to target devices across large distances, and even into locked rooms through glass windows. By exploiting a common vulnerability in these popular devices, an attacker can remotely send inaudible and potentially invisible commands that the devices then act upon, the researchers claim. Once an attacker has gained access to a voice assistant system, they could use this vulnerability to attack other systems. According to the researchers, the vulnerability could be used to gain unauthorized control over your online purchases, smart home switches, smart garage doors, certain vehicles and smart locks.

How it Works

Explaining the vulnerability, a joint research paper published by researchers Takeshi Sugawara, Benjamin Cyr, Sara Rampazzi, Daniel Genkin and Kevin Fu claims that, apart from audio, the microphones in these devices also react to light aimed directly at them. Smart voice assistants rely on the user’s voice to interact with them. The “Light Commands” set-up shines a laser at the microphones to effectively hijack the voice assistant and send inaudible commands to Alexa, Siri, Portal or Google Assistant devices. Based on this principle, by modulating an electrical signal in the intensity of a light beam, the researchers are able to trick microphones into producing electrical signals as if they were receiving genuine audio.
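To make the principle concrete, the sketch below illustrates the underlying signal-processing idea in Python: amplitude-modulating an audio waveform onto a light source’s intensity, so that a microphone struck by that light produces an electrical signal resembling the original audio. This is not the researchers’ code; the function name, bias current and modulation depth are illustrative assumptions only.

```python
# Minimal sketch of the amplitude-modulation idea behind Light Commands.
# All names and parameter values here are hypothetical, not from the paper.
import numpy as np

SAMPLE_RATE = 16_000        # Hz, assumed audio sample rate
DURATION = 1.0              # seconds of example "command" audio
BIAS_CURRENT = 200.0        # mA, hypothetical laser-diode DC operating point
MODULATION_DEPTH = 150.0    # mA, hypothetical peak swing around the bias


def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a normalized audio waveform (-1..1) to a laser-diode drive current.

    The emitted light intensity tracks the drive current, so the intensity
    envelope carries the audio signal (simple amplitude modulation).
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT + MODULATION_DEPTH * audio


if __name__ == "__main__":
    # Stand-in for a recorded voice command: a 440 Hz test tone.
    t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
    command_audio = 0.8 * np.sin(2 * np.pi * 440.0 * t)

    drive_current = audio_to_drive_current(command_audio)
    print(f"drive current range: {drive_current.min():.1f}–{drive_current.max():.1f} mA")
```

In this simplified model, the microphone’s response to the fluctuating light intensity plays the role that sound pressure normally plays, which is why the device interprets the beam as a spoken command.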

How much does it cost?

The set-up comprises commercially available products such as a telephoto lens, a laser driver, a telescope or binoculars, and other equipment that is readily available on the open market. The researchers estimate that all the equipment required to get Light Commands up and running could be procured for less than $600.

The underlying vulnerability cannot be fixed without a redesign of the microphones used by these devices; however, the researchers have approached the manufacturers Google, Amazon and Apple to find a possible solution to the problem.