Cybersecurity researchers have fooled the Tesla Model 3’s automatic navigation system into rapidly braking and taking a wrong turn on the highway.
Israeli firm Regulus Cyber spoofed signals from the Global Navigation Satellite System (GNSS), fooling the Tesla into thinking it was at the wrong location. The spoofing attack caused the car to decelerate rapidly and issue sudden lane-change suggestions. It also made the car signal unnecessarily and attempt to exit the highway at the wrong place, according to the company’s report.
The GNSS is a constellation of satellites that beam location information to earthbound receivers. It’s an umbrella term for the variety of regional systems in use, such as the US GPS system, China’s BeiDou, Russia’s GLONASS, and Europe’s Galileo.
Spoofing replaces genuine satellite signals with false ones, fooling receivers into computing an incorrect position. Regulus used this technique to attack Tesla’s Navigate on Autopilot (NoA) feature.
Introduced in October 2018, NoA is the latest development in Tesla’s ongoing autonomous driving efforts. It complements the existing Enhanced Autopilot feature, which enables cars to accelerate, brake, and steer within their lane. NoA adds the ability to change lanes for speed and navigation, effectively guiding a car through the highway portion of its journey, from onramp to offramp.
NoA initially requires drivers to confirm lane changes by flicking the car’s turn indicator, but Tesla lets them waive the confirmation requirement.
In its report on the test, Regulus said:
The navigate on autopilot feature is highly dependent on GNSS reliability and spoofing resulted in multiple high-risk scenarios for the driver and car passengers.
Regulus first tested the Model S, which doesn’t support NoA. It presented those findings to Tesla and quoted the company’s response in the report. Tesla reportedly said that drivers should remain responsible for their cars at all times and be prepared to override Autopilot and NoA. It also reportedly dismissed the GPS spoofing attack, pointing out that it would only raise or lower the car’s air suspension system.
Regulus then tested NoA on the Model 3. This time, it went public with its findings.
Regulus chief marketing officer Roi Mit told Naked Security:
The purpose of this kind of reporting that we do is to create cooperation. But specifically, in the case of the Tesla 3, we decided to go public on this simply because it was a car that was already available to consumers. It’s already widespread around the world.
As manufacturers inch towards truly autonomous driving systems, GNSS spoofing represents a clear and present danger to drivers, he warned.
The researchers spoofed GNSS signals using a local transmitter mounted on the car, to avoid affecting other vehicles nearby. However, broader attacks extending over several miles are possible, and Russia has systematically spoofed GNSS signals to further its own interests, according to a recent report on the dangers of GNSS attacks.
Todd E. Humphreys, co-author of that report and associate professor at the University of Texas department of aerospace engineering and engineering mechanics, described the attack as “fairly distressing” to Naked Security yesterday. He added:
The Regulus attack could be done from 10km away as long as they had a line of sight to the vehicle.
Does this mean nightmare scenarios in which a terrorist could drive all the Teslas in a 10km radius off the road into concrete bollards, killing their owners? It’s unlikely. For one thing, NoA combines vehicle sensors with the GNSS data, and would only turn off the highway if it spotted road markings that indicated an exit. The Regulus attack was only able to force an exit from the highway onto an alternative ‘pit stop’ with exit markings.
A more realistic attack would involve misdirecting the Tesla’s navigation system to take the left-and-right turns it expected, but at unintended junctions, Humphreys told us.
The Tesla vehicle would have enough visual and radar smarts that it wouldn’t drive off the road into a tree. But you could definitely cause the vehicle to end up in a neighborhood where the driver was not safe.
In fact, researchers at Virginia Tech recently covered this scenario in a paper entitled All Your GPS Are Belong To Us.
Tesla did not respond to our requests for comment yesterday.