
Your voice assistant can hear things you can’t – such as a hacker

What researchers have dubbed DolphinAttack is theoretical - for now - but it affects every voice assistant they tested

Word from Apple, ahead of the big rollout of the iPhone 8 and iOS 11 on September 12, is that its voice assistant Siri is going to sound more like a person and less like a robot.

Great for the user experience. But based on a report published just last week by a team of researchers at Zhejiang University in China, perhaps Apple should have spent more of its time on what Siri hears instead of what users hear.

Because they demonstrated that Siri – along with every other voice assistant (VA) they tested – will respond to commands that don’t come from a human: commands that are not only outside the human vocal range but entirely inaudible to human ears.

Which means your dog could probably hear it. But it also means an attacker could give your VA a command, and you won’t know about it.

In the report, titled “DolphinAttack: Inaudible Voice Commands”, the researchers said they were able to validate the attack on Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Amazon’s Alexa. Using ultrasonic voice commands at frequencies above 20 kHz (a rough sketch of how such a signal is put together follows the list), they got the VAs to

  • Visit a malicious website, “which can launch a drive-by-download attack or exploit a device with 0-day vulnerabilities”.
  • Spy on the user by initiating outgoing video or phone calls, thereby gaining access to the sound and images of the device’s surroundings.
  • Inject fake information, by instructing the device, “to send fake text messages and emails, to publish fake online posts, to add fake events to a calendar, etc”.
  • Impose a denial of service, through a command to turn on airplane mode, disconnecting all wireless communications.
  • Conceal attacks by dimming the screen and lowering the volume of the device.
  • “Tested attacks include launching Facetime on iPhones, playing music on an Amazon Echo and manipulating the navigation system in an Audi automobile,” the team wrote, which means an attacker could change the destination on your GPS.
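
The inaudible commands aren’t magic: the researchers’ paper describes amplitude-modulating an ordinary spoken command onto an ultrasonic carrier, which the non-linear behaviour of a device’s microphone hardware then demodulates back into the range the speech recognizer expects. Below is a minimal sketch of that kind of modulation in Python with NumPy; the sample rate, carrier frequency and the stand-in “voice” tone are all illustrative choices, not values taken from the paper.

    import numpy as np

    # Illustrative sketch of DolphinAttack-style amplitude modulation:
    # an audible "command" rides on an ultrasonic carrier humans can't hear.
    # All parameter values here are hypothetical.
    FS = 192_000          # sample rate high enough to represent a >20 kHz carrier
    CARRIER_HZ = 25_000   # ultrasonic carrier, above the range of human hearing
    DURATION_S = 1.0

    t = np.arange(int(FS * DURATION_S)) / FS

    # Stand-in for a recorded voice command ("Hey Siri", "Alexa, ...").
    # Here it's just a 400 Hz tone, scaled to [0, 1] for use as an AM envelope.
    voice = 0.5 * (1.0 + np.sin(2 * np.pi * 400 * t))

    # Classic amplitude modulation: the command becomes the envelope of the
    # inaudible carrier. A microphone's non-linear response can recover that
    # envelope, so the assistant "hears" a command that nearby humans don't.
    ultrasonic_signal = voice * np.sin(2 * np.pi * CARRIER_HZ * t)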

There are limits – significant limits – on the capability of launching an attack. It can’t be done remotely from miles away, like the famous 2015 hack of a Jeep Cherokee by Charlie Miller and Chris Valasek. While it only takes about $3 worth of hardware added to a smartphone, it requires being anywhere from a few inches to a few feet from a potential victim. So an attacker likely can’t tell Alexa to unlock your back door if he’s not already in your house.

In a public place, however – a crowded subway for one – it wouldn’t be difficult to get very close to other devices.

But another barrier is that on smartphones, the screen would have to be unlocked already for most ultrasound commands to work. Siri will make a phone call to somebody in a user’s contact list without the screen being unlocked, but it won’t do more sensitive things like open a website, open third-party apps, conduct a financial transaction or send text messages.

Obviously, that barrier is removed if somebody is actively using their phone, since that means they’ve unlocked it. But in that case, if Siri got a surreptitious ultrasound command from an attacker, the intended victim would likely be looking at the phone and would see that something unusual was happening.

The researchers did offer suggestions to defend against DolphinAttack, including modifying the microphone so it doesn’t respond to anything outside of the human vocal range.
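
That suggestion is a hardware change, but the same idea can be approximated in software: throw away everything above the voice band before the audio ever reaches the recognizer. A minimal sketch, assuming SciPy and an arbitrary 8 kHz cut-off (both the library and the cut-off are illustrative choices, not anything the researchers prescribe):

    from scipy.signal import butter, sosfilt

    def suppress_ultrasound(samples, fs, cutoff_hz=8_000, order=6):
        """Low-pass filter captured audio so content above the human voice
        band never reaches the speech recognizer. The cut-off frequency and
        filter order are illustrative choices, not recommended values."""
        sos = butter(order, cutoff_hz, btype="low", fs=fs, output="sos")
        return sosfilt(sos, samples)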

But some experts say voice recognition software actually needs those inaudible higher frequencies to analyze what a person is saying – that they are a part of human speech that is inaudible to the human ear but not to a computer. Gadi Amit, founder of NewDealDesign, told Fast Code Design that making VA technology ignore ultrasound frequencies might cause “a negative effect that lowers the comprehension score of the whole system”.

Then again, if Apple and others are able to make their VAs sound more like a human, perhaps they can configure them to recognize when a command is coming from something other than a human.

Or, a security-conscious user with an iPhone could simply go to Settings and turn off the “Hey Siri” option, which would then require pressing the Home button to issue a command.

Apple declined to comment on the report, while Amazon said in response:

We take privacy and security very seriously at Amazon and are reviewing the paper issued by the researchers.

For now, it appears this is more of a demonstration of potential risk than an imminent security disaster – the researchers are due to present their paper in a couple of months at the ACM Conference on Computer and Communications Security. But, as is the case with just about every kind of hacking technology, it is likely to improve. So it would be wise for VA developers and designers to get out ahead of that potentially malicious progress with some security progress of their own.


2 Comments

So someone could exfiltrate data on a “secure” computer with USB and disks disabled and limited networking by using a shim to modulate a 20 kHz carrier and singing it to Alexa?

