A newly discovered photoacoustic flaw in voice assistants such as Siri, Alexa, and Google Assistant can render them vulnerable to a number of attacks that use lasers to inject inaudible commands into smartphones and speakers, and surreptitiously cause them to unlock doors, shop on e-commerce websites, and even start vehicles.

The attacks — dubbed Light Commands — were disclosed by researchers from the Tokyo-based University of Electro-Communications and the University of Michigan.

The novel attack works by injecting acoustic signals into microphones using laser light — from as far away as 110 meters, or 360 feet — exploiting a vulnerability in MEMS (micro-electro-mechanical systems) microphones that causes them to unintentionally respond to light just as they would to sound.

“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers outlined in a paper.

However, there are no indications so far that this attack has been maliciously exploited in the wild.

While the attack requires the laser beam to be in direct line of sight to the target device, it highlights the dangers of remotely activating voice-controlled systems that lack any form of authentication such as a password. More troublingly, these light commands can be issued across buildings and even through closed glass windows.

MEMS microphones contain a small, built-in plate called the diaphragm, which, when hit with sound or light waves, translates the input into an electrical signal that is then decoded into the actual commands. What the researchers found was a way to encode sound by adjusting the intensity of the laser beam, causing the microphones to produce electrical signals in the same way sound would.
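The intensity-modulation idea described above can be sketched in a few lines of Python. This is purely an illustration (not code from the researchers' paper): an audio waveform is added to a constant brightness bias, so the beam's intensity rises and falls the way sound pressure would. All constants are arbitrary illustration values.

```python
import numpy as np

# Map a normalized audio waveform onto light intensity by riding it on a
# constant (DC) bias, since a beam's brightness cannot go negative.
SAMPLE_RATE = 44_100          # samples per second
DURATION = 0.01               # seconds of signal to generate
TONE_HZ = 440.0               # a simple sine stands in for a voice command

t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
audio = np.sin(2 * np.pi * TONE_HZ * t)       # normalized audio in [-1, 1]

DC_BIAS = 0.5                 # mean beam intensity (0 = off, 1 = full power)
MOD_DEPTH = 0.4               # how strongly the audio sways the intensity

intensity = np.clip(DC_BIAS + MOD_DEPTH * audio, 0.0, 1.0)

# The resulting intensity signal stays physical (non-negative) while still
# carrying the audio waveform that the MEMS diaphragm reacts to.
print(intensity.min() >= 0.0, intensity.max() <= 1.0)
```

A real attack setup would feed such a signal to a laser driver; here it simply demonstrates why the microphone's output can mirror an attacker-chosen waveform.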

An attacker, therefore, could leverage a setup consisting of a laser pointer, a laser driver, and a sound amplifier to hijack the voice assistant and remotely issue commands to Alexa, Siri, Portal, or Google Assistant without the victim’s intervention. To make it even more stealthy, a hacker could use an infrared laser, which is invisible to the naked eye.

The researchers are still working to determine exactly what causes MEMS microphones to respond to light. At this stage, they have attributed the cause to a “semantic gap between the physics and specifications of MEMS.”

Researchers said they tested the attack on a variety of devices that use voice assistants, including the Google Nest Cam IQ, Amazon Echo, Facebook Portal, iPhone XR, Samsung Galaxy S9, and Google Pixel 2. But they caution that any system that uses MEMS microphones and acts on data without additional user confirmation might be vulnerable.

Such unauthorized commands can be mitigated by adding a second layer of authentication, acquiring audio input from multiple microphones, or implementing a cover that physically blocks light from hitting the mics.
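The multiple-microphone mitigation can be made concrete with a small sketch. The intuition (an assumption for illustration, not a design from the paper) is that a laser spot typically illuminates only one microphone, while real speech reaches all of them at comparable levels, so a device could reject commands whose energy appears on a single mic only. The function name and threshold below are hypothetical.

```python
def command_is_plausible(mic_levels, ratio_threshold=0.25):
    """Accept a command only if every microphone heard a comparable level.

    mic_levels: per-microphone signal energies for the detected command.
    ratio_threshold: minimum allowed ratio of quietest to loudest mic.
    """
    loudest = max(mic_levels)
    if loudest == 0:
        return False                  # nothing was heard at all
    quietest = min(mic_levels)
    return quietest / loudest >= ratio_threshold

# Real speech arrives at all four mics with roughly similar energy:
print(command_is_plausible([0.9, 0.8, 1.0, 0.85]))   # True

# A laser-injected signal shows up on one mic but barely on the others:
print(command_is_plausible([0.0, 0.0, 1.0, 0.01]))   # False
```

A production defense would of course be more involved (directionality, timing, spectral cues), but the cross-check above captures why multiple microphones raise the bar for a single-beam attack.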
