
Tesla 3 navigation system fooled with GPS spoofing

Cybersecurity researchers have fooled the Tesla Model 3’s automatic navigation system into rapidly braking and taking a wrong turn on the highway.

Israeli firm Regulus Cyber spoofed signals from the Global Navigation Satellite System (GNSS), fooling the Tesla vehicle into thinking it was at the wrong location. The spoofing attack caused the car to decelerate rapidly, and created rapid lane-changing suggestions. It also made the car signal unnecessarily and try to exit the highway at the wrong place, according to the company’s report.

GNSS is an umbrella term for the constellations of satellites that broadcast timing and positioning signals, which earthbound receivers use to calculate their own location. It covers the various regional systems in use, such as the US GPS system, China’s BeiDou, Russia’s GLONASS, and Europe’s Galileo.

Spoofing replaces the genuine signals with counterfeit ones, tricking receivers into calculating a false position. Regulus used this technique to attack Tesla’s Navigate on Autopilot (NoA) feature.
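To see why counterfeit radio signals are enough to move a car’s idea of where it is, it helps to remember that a receiver computes its position itself: it measures its distance to several satellites and solves for the one point consistent with all of them. The sketch below is purely illustrative Python with made-up satellite coordinates and no clock-bias term, not Regulus’s tooling or Tesla’s firmware; it runs that least-squares calculation on honest ranges, then on ranges a hypothetical spoofer might broadcast, and the same maths obediently lands on the attacker’s chosen spot.

import numpy as np

# Estimate a receiver position from measured ranges to known satellite
# positions using a few Gauss-Newton iterations (clock bias omitted for brevity).
def solve_position(sat_positions, measured_ranges, iterations=10):
    estimate = np.zeros(3)                              # initial guess at the origin
    for _ in range(iterations):
        vectors = sat_positions - estimate              # receiver -> satellite vectors
        predicted = np.linalg.norm(vectors, axis=1)     # ranges implied by the guess
        geometry = -vectors / predicted[:, None]        # line-of-sight (design) matrix
        correction, *_ = np.linalg.lstsq(geometry, measured_ranges - predicted, rcond=None)
        estimate = estimate + correction                # nudge the guess towards consistency
    return estimate

# Four hypothetical satellites (coordinates in metres) and a true receiver position.
sats = np.array([[20e6,  5e6, 10e6],
                 [-5e6, 22e6,  8e6],
                 [10e6, -8e6, 21e6],
                 [-9e6, -9e6, 20e6]])
true_pos = np.array([1000.0, 2000.0, 0.0])
print("honest fix: ", solve_position(sats, np.linalg.norm(sats - true_pos, axis=1)))

# A spoofer never touches the receiver: it simply broadcasts signals whose ranges
# are consistent with a different location, and the same calculation lands there.
fake_pos = true_pos + np.array([0.0, 150.0, 0.0])
print("spoofed fix:", solve_position(sats, np.linalg.norm(sats - fake_pos, axis=1)))

Real receivers also solve for their own clock error as a fourth unknown, but the principle the attack exploits is the same: the fix follows whatever signals the antenna hears, with no built-in way to tell a genuine satellite from a nearby transmitter.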

Introduced in October 2018, NoA is the latest development in Tesla’s ongoing autonomous driving efforts. It builds on the existing Enhanced Autopilot feature, which lets cars accelerate, brake, and steer within their lane. NoA adds the ability to change lanes for speed and navigation, effectively guiding a car along the highway portion of its journey, from onramp to offramp.

NoA initially requires drivers to confirm lane changes by flicking the car’s turn indicator, but Tesla lets them waive the confirmation requirement.

In its report on the test, Regulus said:

The navigate on autopilot feature is highly dependent on GNSS reliability and spoofing resulted in multiple high-risk scenarios for the driver and car passengers.

Regulus first tested the Model S, which doesn’t support NoA. It presented those findings to Tesla and quoted its response in the report. Tesla reportedly said that drivers should be responsible for the cars at all times and prepared to override Autopilot and NoA. It also reportedly dismissed the GPS spoofing attack, pointing out that it would only raise or lower the car’s air suspension system.

Regulus then tested NoA on the Model 3. This time, it went public with its findings.

Regulus chief marketing officer Roi Mit told Naked Security:

The purpose of this kind of reporting that we do is to create cooperation. But specifically, in the case of the Tesla 3, we decided to go public on this simply because it was a car that was already available to consumers. It’s already widespread around the world.

As manufacturers inch towards truly autonomous driving systems, GNSS spoofing represents a clear and present danger to drivers, he warned.

The researchers spoofed the GNSS signals using a local transmitter mounted on the car itself, to avoid affecting other vehicles nearby. However, broader attacks extending over several miles are possible, and Russia has systematically spoofed GNSS signals to further its own interests, according to a recent report on the dangers of GNSS attacks.

Todd E. Humphreys, co-author of that report and associate professor in the University of Texas Department of Aerospace Engineering and Engineering Mechanics, speaking to Naked Security yesterday, described the attack as “fairly distressing”. He added:

The Regulus attack could be done from 10km away as long as they had a line of sight to the vehicle.

Does this mean nightmare scenarios in which a terrorist could drive all the Teslas in a 10km radius off the road into concrete bollards, killing their owners? It’s unlikely. For one thing, NoA combines vehicle sensors with the GNSS data, and would only turn off the highway if it spotted road markings that indicated an exit. The Regulus attack was only able to force an exit from the highway onto an alternative ‘pit stop’ with exit markings.
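One defence against this sort of trickery is exactly that kind of cross-checking: treating the GNSS fix as just one vote among the car’s sensors. The fragment below is a rough, hypothetical sketch of the idea, not Tesla’s implementation (which isn’t public): it dead-reckons the car’s position from wheel speed and heading, and rejects any GNSS fix that has jumped further than the car could plausibly have travelled.

import math
from dataclasses import dataclass

@dataclass
class Fix:
    x: float  # metres east
    y: float  # metres north

# Predict where the car should be now from its last trusted position,
# wheel speed and heading over the elapsed time (simple dead reckoning).
def dead_reckon(last: Fix, speed_mps: float, heading_rad: float, dt_s: float) -> Fix:
    return Fix(last.x + speed_mps * math.cos(heading_rad) * dt_s,
               last.y + speed_mps * math.sin(heading_rad) * dt_s)

# Accept a GNSS fix only if it roughly agrees with the on-board prediction.
def gnss_is_plausible(reported: Fix, predicted: Fix, tolerance_m: float = 15.0) -> bool:
    return math.hypot(reported.x - predicted.x, reported.y - predicted.y) <= tolerance_m

# Cruising east at 30 m/s; one second has passed since the last trusted fix.
predicted = dead_reckon(Fix(0.0, 0.0), speed_mps=30.0, heading_rad=0.0, dt_s=1.0)
print(gnss_is_plausible(Fix(29.0, 1.0), predicted))    # True  - close to prediction
print(gnss_is_plausible(Fix(180.0, -40.0), predicted)) # False - implausible jump

A check like this only catches sudden jumps, though; a patient attacker who drags the false position away gradually could stay under the tolerance, which fits with the slower, more plausible misdirection Humphreys describes next.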

A more realistic attack would involve misdirecting the Tesla’s navigation system to take the left-and-right turns it expected, but at unintended junctions, Humphreys told us.

The Tesla vehicle would have enough visual and radar smarts that it wouldn’t drive off the road into a tree. But you could definitely cause the vehicle to end up in a neighbourhood where the driver was not safe.

In fact, researchers at Virginia Tech recently covered this scenario in a paper entitled All Your GPS Are Belong To Us.

Tesla did not respond to our requests for comment yesterday.

6 Comments

As the driver, you are responsible for monitoring the car and being prepared to override it at any time. In fact, none of Tesla’s autopilot features are enabled by default. As an owner, you have to go into several settings menus and turn each one on individually, with human-readable disclaimer text popping up for several that you have to agree to.

Furthermore, spoofing GPS would cause human drivers relying on GPS to make bad decisions. You could also put up fake road signs, or any number of shenanigans to spoof machines and humans alike.


Time: A Lot of Cars Got Stuck in the Mud After Following Google Maps on a Detour Due to Unforeseen Circumstances

[URL redacted]


Autonomous driving won’t happen, so it’s a moot point. Not every road has perfect, up to date, clear road markings like a highway. Some roads have no markings at all. Some roads aren’t even paved.


Sure it will. If the computer between your ears can process how to drive on those types of roads, computers can be taught that too. It will take a while for that to happen, but it will.


In future news: “Hacker pranksters gimmick transporter circuits and send unwitting victims to remote countries during beaming.” It’ll happen, of course. Don’t believe me? After all, the urban legend of the Stella Awards and the Winnebago whose “cruise control” https://www.snopes.com/fact-check/cruise-uncontrol/ failed to drive it safely, became reality with the 100 drivers who followed their satnav down a muddy road through a farmer’s field. https://nakedsecurity.sophos.com/2019/06/28/google-maps-shortcut-turns-into-100-car-mud-pie-in-farmers-field
Possibly the best use for a self-driving car would be for it to, when detecting an intoxicated driver, to have it pull over, park, and automatically call in for police assistance. After all, it wouldn’t be the first time a car turned snitch on its owner: https://www.theguardian.com/technology/2015/dec/07/florida-woman-arrested-hit-and-run-car-calls-police
Dang it, I love these stories!


Tesla has a long track record of blaming drivers for accidents with their self-driving cars. They tell drivers to keep hands on the wheel so they can intervene but also tell them not to intervene as the car drives itself. Who is in control and liable for an accident?

I’d rather drive myself thanks.

