A team of cybersecurity researchers has discovered a clever technique to remotely inject inaudible and invisible commands into voice-controlled devices, simply by shining a laser at the targeted device instead of speaking to it.
Dubbed ‘Light Commands,’ the hack exploits a vulnerability in the MEMS microphones embedded in widely used voice-controllable systems, which unintentionally respond to light as if it were sound.
According to experiments by a team of researchers from the University of Michigan and the University of Electro-Communications in Tokyo, a remote attacker several meters away from a device can covertly trigger the attack simply by modulating the amplitude of laser light to produce an acoustic pressure wave.
“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers said in their paper [PDF].
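In plain terms, the recorded voice command becomes the modulating signal and the laser's optical power becomes the carrier. A minimal Python sketch of that amplitude-modulation idea is below; all parameters (sample rate, bias power, modulation depth) are hypothetical illustrations, not values from the paper, and the real attack of course requires a laser driver and precise aiming at the microphone port:

```python
import numpy as np

# Hypothetical parameters for illustration only.
SAMPLE_RATE = 16_000   # audio sample rate in Hz (assumed)
P_BIAS = 1.0           # DC optical power bias, arbitrary units
MOD_DEPTH = 0.5        # modulation depth m, with 0 < m <= 1

def light_intensity(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform (normalized to [-1, 1]) onto laser intensity.

    The command amplitude-modulates the beam around a DC bias so the
    optical power stays non-negative: I(t) = P_bias * (1 + m * a(t)).
    """
    audio = np.clip(audio, -1.0, 1.0)
    return P_BIAS * (1.0 + MOD_DEPTH * audio)

# Example: a 440 Hz test tone standing in for a spoken command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
intensity = light_intensity(tone)
```

Because a MEMS microphone's diaphragm responds to these intensity fluctuations much as it would to sound pressure, the device's signal chain sees what looks like a genuine audio waveform.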
Doesn’t this sound creepy? Now read this part carefully…
Smart voice assistants in your phones, tablets, and other smart devices, including Google Home and Nest Cam IQ, Amazon Alexa and Echo, Facebook Portal, and Apple's Siri devices, are all vulnerable…