
Smartphone and speaker voice assistants can be hacked using lasers


As keen tech adopters know, Google Assistant, Amazon Alexa, Apple Siri, and Facebook Portal are AI-powered internet platforms that users control by issuing voice commands through smartphones or home ‘smart’ speakers.

Or at least that’s what we thought until this week, when a US-Japanese team published a research paper confirming an interesting and underestimated possibility – these devices will also accept “signal injection” commands sent to them as pulses of laser light over distances of a hundred metres or more.

Hitherto, hacking such systems has been about sending them audible commands without their owner’s knowledge. Now the research confirms that it’s possible to achieve the same result over considerable distances in ways that might allow attackers to unlock “smartlock-protected front doors, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles” that are connected to the victim’s Google account.

It’s a point worth remembering – voice assistants aren’t just gimmicks or conveniences, and a growing volume of security-sensitive technology is now hooked up to them.

The shining

But voice-controlled devices accepting commands by way of light?

It sounds unlikely, but what makes it possible is the photo-acoustic effect, which has been known since 1880, when Alexander Graham Bell invented an optical communication device – the photophone – that exploited it.

He discovered that shining light onto an object heats it very slightly in a way that generates sound waves, which microphones – including today’s MEMS (micro-electromechanical systems) diaphragms – turn into electrical signals.

The researchers summarise this:

Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio.
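In signal-processing terms, that is simple amplitude modulation: the command audio is used to vary the laser’s brightness around a constant bias. Below is a minimal Python sketch of the idea – the tone, bias level and modulation depth are invented placeholders for illustration, not figures from the paper.

import numpy as np

# Illustrative sketch only: amplitude-modulate a stand-in "voice command"
# onto a laser drive signal. All values are made-up placeholders.
SAMPLE_RATE = 44_100                     # samples per second
DURATION = 1.0                           # one second of signal

t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# A 440 Hz tone stands in for the recorded voice command.
command_audio = np.sin(2 * np.pi * 440 * t)
command_audio /= np.max(np.abs(command_audio))   # normalise to [-1, 1]

# Map the audio onto laser intensity: a constant bias keeps the diode on,
# while the audio varies the brightness around that bias.
BIAS = 0.5     # nominal drive level (arbitrary units)
DEPTH = 0.4    # modulation depth, chosen so intensity never goes negative

laser_intensity = BIAS + DEPTH * command_audio
assert laser_intensity.min() >= 0

# A MEMS microphone hit by this beam effectively demodulates it: the
# fluctuating light heats the diaphragm, producing an electrical signal
# that tracks command_audio as if it were real sound.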

Engineers, including those designing voice-controlled devices, should know this. Unfortunately, Bell’s discovery lost out to radio communications, and photo-acoustics was put on the back burner.

The principle is explained in a video made by the researchers.

What’s possible

According to the researchers, as long as the light beam is carefully aimed using a telephoto lens and its intensity is correctly tuned, any MEMS-based microphone used in popular devices is vulnerable.

The distance at which the attack works varies by device, ranging from up to 110 metres for the Google Home and first-generation Echo Plus down to just over 20 metres for the Apple iPhone XR and sixth-generation iPad.

The equipment used to carry out the tests was modest: a cheap five-milliwatt laser pointer, a laser driver, a sound amplifier and a basic telephoto lens, costing less than $600 in total.

An objection that voice assistant manufacturers might make is that this kind of laser attack still needs a line of sight, for example from one building to another. It’s not clear how often this would be possible under real-world conditions. The obvious mitigation is to keep these devices away from windows.

However, the researchers believe that making assumptions is the wrong way to understand vulnerabilities in this expanding class of gatekeeper devices.

Currently, the stock microphones that receive voice commands perform no authentication beyond checking that wake phrases such as “OK Google” are spoken in the owner’s voice – and even this check can be spoofed using voice synthesis.

The authentication problem could be mitigated in different ways – for example, by requiring that more than one microphone detect the same command simultaneously, something a laser attack would find difficult to overcome.
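As a rough sketch of that idea, the hypothetical check below accepts a command only if two microphone channels heard roughly the same thing; the function name and thresholds are invented for illustration and don’t come from the paper or any shipping device.

import numpy as np

def channels_agree(mic_a: np.ndarray, mic_b: np.ndarray,
                   max_energy_ratio: float = 4.0,
                   min_correlation: float = 0.5) -> bool:
    """Hypothetical cross-check: a laser aimed at a single MEMS port
    drives one channel strongly and leaves the other nearly silent,
    whereas genuine speech reaches every microphone on the device."""
    energy_a = float(np.mean(mic_a ** 2))
    energy_b = float(np.mean(mic_b ** 2))
    if min(energy_a, energy_b) == 0.0:
        return False                      # one channel heard nothing
    if max(energy_a, energy_b) / min(energy_a, energy_b) > max_energy_ratio:
        return False                      # one channel is far louder
    correlation = np.corrcoef(mic_a, mic_b)[0, 1]
    return bool(correlation >= min_correlation)

A real device would need something far more robust – beam spill, echoes and room acoustics all complicate the picture – but it illustrates why a single beam aimed at one microphone port would struggle to pass such a check.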

In the end, what matters to the large numbers of consumers buying voice assistants is that makers start taking their security more seriously by assuming the worst, rather than hoping for the best.
