How many tiny accelerometers do you depend on? There’s the one in your smartphone, telling it which way’s up, so it can adjust the screen horizontally or vertically (or track your footsteps or how fast you’re running, or for that matter, transform your iPhone into a seismometer).
For similar reasons, there’s one in your Fitbit-type contraption too. Then there are devices like Microsoft’s Kinect and Nintendo’s Wii, which use them to help track motion. And that’s not all. You can find them in toy remote control cars (and real cars, which use them to detect rapid deceleration and trigger your airbag) and even medical devices – where they might soon help control when and how much medicine you get.
The answer is probably more than you think. Oh… and they’re hackable. A team of computer scientists and engineers from the University of Michigan and the University of South Carolina has done it using nothing but carefully tuned audio.
Similar “resonant acoustic injection attacks” have already been used to disable related MEMS devices, such as gyroscopes. Now, these scientists have gone a step further, spoofing MEMS accelerometers with “intentional acoustic interference” that takes “fine-grained control” of the device and generates specific spurious values.
Since most current systems blindly trust accelerometer data, acting on it as gospel truth, that’s not good.
As reported by Michigan Engineering, Professor Kevin Fu’s team:
…used precisely tuned acoustic tones to deceive 15 different models of accelerometers into registering movement that never occurred.
This works because the heart of a MEMS accelerometer is analog: “a mass suspended on springs” that moves when the object it’s embedded in changes speed or direction. Such movements, however, can also be created by sound waves, even if the bigger object hasn’t budged. And that opens “a backdoor into the devices – enabling the researchers to control other aspects of the system”.
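To see why a sound wave at just the right frequency can move the sensing mass while the device itself stays still, it helps to simulate the mass-on-springs system directly. The sketch below drives a lightly damped oscillator with a unit-amplitude tone, once at its resonant frequency and once well below it; the resonant tone produces a far larger excursion, which the accelerometer’s electronics would read as real acceleration. The natural frequency and damping ratio here are invented for illustration – the researchers had to characterize each real device’s resonant frequency individually.

```python
import math

# Hypothetical MEMS sensing element: a mass on springs with light damping.
# Both numbers are assumptions for illustration, not from the paper.
F0 = 2000.0              # natural frequency of the suspended mass (Hz)
OMEGA0 = 2 * math.pi * F0
ZETA = 0.01              # damping ratio (lightly damped -> sharp resonance)
DT = 1e-6                # integration step (s)
STEPS = 200_000          # 0.2 s of simulated time, enough to reach steady state

def peak_displacement(f_drive):
    """Drive the mass with a unit-amplitude tone; return its peak excursion."""
    omega = 2 * math.pi * f_drive
    x, v, peak = 0.0, 0.0, 0.0
    for n in range(STEPS):
        # mass-spring-damper: x'' = drive - 2*zeta*w0*x' - w0^2 * x
        a = math.sin(omega * n * DT) - 2 * ZETA * OMEGA0 * v - OMEGA0**2 * x
        v += a * DT          # semi-implicit Euler: velocity first,
        x += v * DT          # then position with the updated velocity
        peak = max(peak, abs(x))
    return peak

on_res = peak_displacement(F0)       # tone at the resonant frequency
off_res = peak_displacement(F0 / 4)  # same tone, well below resonance
print(on_res / off_res)              # resonance amplifies the motion many times over
```

The same drive amplitude moves the mass dozens of times further at resonance than off it – which is why an attacker who finds the resonant frequency can shake the mass with nothing more than a speaker.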
75% of the accelerometers Fu’s team tested were vulnerable to the first type of attack they conjured up; 65% proved vulnerable to the second. In these cases, as Fu put it:
The fundamental physics of the hardware allowed us to trick sensors into delivering a false reality to the microprocessor.
Having fooled the accelerometer, they next showed how their exploit could change real systems’ physical behavior. For instance, they coaxed a Fitbit into generating 3,000 steps per hour that its owner never walked.
You laugh, but that exploit could have been used to cash in on rewards programs designed to reward exercise and promote better health. And what if we told you they also coaxed a Samsung Galaxy S5 accelerometer into generating the word “WALNUT” in a graph of its readings?
Still laughing? Yeah, OK, us too, a bit. Here’s one that’s a bit spookier: they got a smartphone to play a music file that hijacked the output of the phone’s own accelerometer, then used that fake output to steer a toy car via an app running on the same phone.
As is so often the case, the principles matter more than the proofs-of-concept, however goofy or spooky they may be: analog devices are vulnerable and require cybersecurity attention. Cyber-physical systems bring unique security challenges. And, just perhaps, more systems should take precautions before instantly acting on their sensory inputs.
Principles like these helped earn this research an official Alert (ICS-ALERT-17-073-01) from the US Department of Homeland Security’s Industrial Control Systems Cyber Emergency Response Team.
Meanwhile, those tiny accelerometers are out there by the billions. You couldn’t exactly recall or replace them all. Fortunately, the paper’s authors have proposed two “low-cost software defenses” that could be implemented in firmware, for systems that can be found and flashed.
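One of the ideas behind such defenses is randomized sampling: an injected tone only produces a stable false reading because the ADC samples it at regular intervals, so every sample catches the tone at the same phase. Jitter the sample timing and the tone’s contribution becomes incoherent noise that averages away. The sketch below is our own minimal illustration of that principle, not the authors’ firmware; the tone frequency, sample rate, and phase are all invented numbers.

```python
import math
import random

# Assumed for illustration: a 1 kHz injection tone hitting an ADC that
# nominally samples at 100 Hz (an exact divisor, the attacker's ideal case).
F_TONE = 1000.0
F_SAMPLE = 100.0
PHASE = 0.7       # tone's phase relative to the sample clock
N = 2000

def tone(t):
    # the attacker's acoustic tone, as seen by the sensing mass
    return math.sin(2 * math.pi * F_TONE * t + PHASE)

# Regular sampling: 1 kHz is an exact multiple of 100 Hz, so every sample
# lands at the same phase of the tone and the ADC reports a steady false
# offset -- the kind of "fine-grained control" the attack relies on.
regular = [tone(n / F_SAMPLE) for n in range(N)]

# Randomized sampling: jitter each sample instant by up to one tone period,
# so the tone's phase is random at each sample and it averages toward zero.
random.seed(42)
jittered = [tone(n / F_SAMPLE + random.uniform(0.0, 1.0 / F_TONE))
            for n in range(N)]

print(sum(regular) / N)   # a large, coherent bias
print(sum(jittered) / N)  # close to zero
```

Real motion changes slowly compared to the jitter window, so it survives the averaging – only the coherent high-frequency injection is destroyed.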
They’ve also made suggestions for tightening up hardware design going forward – and, of course, notified the affected accelerometer manufacturers.