Two troublesome words tucked into proposed US legislation related to cybersecurity for cars and trucks could have some unintended consequences for vehicle security if it ever becomes the law of the land.
Car hacking has attracted increased scrutiny from lawmakers in recent months, after researchers disclosed security vulnerabilities in several Fiat Chrysler models, causing the car maker to recall 1.4 million vehicles for a security patch.
The intent of the legislation, under consideration in the US House of Representatives, is to prevent malicious attackers from accessing the onboard computers in vehicles in such a way that threatens drivers’ safety and privacy.
But some watchdogs worry that two words in the proposed law – “without authorization” – could also make it illegal for researchers to examine code in automotive computers.
The proposed language would prohibit anyone from accessing a vehicle’s electronic control unit or critical system without authorization, under penalty of a $100,000 civil fine – but authorization from whom is unclear.
“It shall be unlawful for any person to access, without authorization, an electronic control unit or critical system of a motor vehicle, or other system containing driving data for such motor vehicle, either wirelessly or through a wired connection.”
That ambiguity, say regulators and watchdogs such as the US Federal Trade Commission (FTC) and the Electronic Frontier Foundation (EFF), could actually have consequences for security research that would make cars less safe.
If security researchers are barred from finding security vulnerabilities in cars, the argument goes, it’s possible those vulnerabilities could be exploited by someone who actually wants to do harm.
This same rationale – more researchers tinkering makes software more secure – is what motivates companies in many industries to sponsor bug bounty programs that reward researchers for responsibly disclosing security holes.
Responsible disclosure is key to a mutually beneficial relationship between security researchers and vendors, but there’s a big gray area between sponsored security research and malicious hacking.
Car hackers Charlie Miller and Chris Valasek – who demonstrated how they could exploit a Jeep Cherokee’s connected entertainment system to hijack the vehicle’s steering, accelerator and brakes – worked with Fiat Chrysler to patch the security hole.
The world’s biggest auto manufacturer, General Motors, is reportedly launching a bug bounty program, and Tesla Motors has its own bug bounty program inviting hackers to take a look inside its connected cars.
With respect to the recently proposed car hacking legislation, skeptics say that researchers would be less likely to hunt down bugs in cars for fear of being fined, or worse.
If the car hacking legislation were to become law, it could do damage to consumer privacy as well, according to the FTC.
During testimony this week before the US House Energy and Commerce Committee, Maneesha Mithal, associate director of the FTC’s Division of Privacy and Identity Protection, told lawmakers that several parts of the bill are troubling.
Language in the bill that would give a safe harbor to manufacturers who submit privacy policies to the Department of Transportation is too broad, Mithal said – and could end up shielding carmakers from enforcement of consumer protection laws, even if those companies violate their own privacy policies.
Overall, the legislation could “substantially weaken the security and privacy protections that consumers have today,” Mithal said.
The FTC testimony pointed to another part of the bill that could make the development of automobile cybersecurity standards captive to industry representatives.
Now, this is only a draft bill, and it’s unlikely to become law in its present form.
Other car cybersecurity legislation under consideration in the US Senate, sponsored by Senator Edward Markey, is much better with respect to privacy, according to the privacy watchdog EFF.
On the other hand, the EFF says, the “without authorization” language in the US House committee’s proposal brings to mind some “bad laws” already on the books that raise similar concerns.
The EFF is currently asking the US Copyright Office to exempt car owners from provisions in the Digital Millennium Copyright Act (DMCA) that restrict a car owner’s ability to inspect or modify code in their own vehicle.
Car manufacturers, meanwhile, have argued in comments to the Copyright Office that allowing anyone to inspect code without permission would help malicious hackers and competitors.
These laws have some good intentions – to protect people and companies from cybercriminals who could steal their data or intellectual property, and, presumably, to make it illegal for someone to tweak the code in your own daily driver without asking you first.
But if these laws are discouraging security researchers from doing the kind of testing and reverse engineering that could improve security and safety for all, it’s probably time to find the bugs in those laws – and fix them.
Image of miniature car mechanics courtesy of Shutterstock.com.
Sammie
Time to get one of those classic Chevys. I wonder if going back to a past generation of gadgets is the better way to stay safe and safeguard one’s privacy.
4caster
Makes a change from the manufacturer falsifying the software.
Al
So, if it were an offence to try to check the software, it would simply mean that all the manufacturers could get away with whatever software hacks they wanted to cheat tests, with impunity.
Rico Robbins
Okay, but consider this: Why should COPYRIGHT law be used to make a form of SECURITY research illegal? I know that the DMCA, specifically section 1201, makes it illegal to bypass TPMs, but in all honesty, is that even necessary? It brings copyright into areas where it shouldn’t be an issue. Security research in cars isn’t the only issue with this part of the law. For instance, Apple uses both its EULA and the DMCA to try to make jailbreaking an iOS device illegal. Whether someone jailbreaks an iPhone, or finds a bug in a car’s copyrighted software that poses a security vulnerability, neither act harms the market for the copyrighted work, and either could even be considered fair use. Copyright is only designed to “promote the progress of science and the useful arts” through a limited monopoly, and shouldn’t be used to enforce a corporation’s end user policy. Furthermore, section 1201 makes noninfringing activities illegal (such as bypassing DRM to make a fair use of video content), imposing a penalty separate from copyright infringement itself. I’d argue it would be best for content creators, remixers, and security researchers alike for section 1201 of the DMCA to be tossed out for good.
Mahhn
So would it now be a crime to hack your own car? You own the hardware, but is the software copyrighted and therefore untouchable? Speed shops that do performance tuning could be considered criminals. Politicians are such bubble people.
Paul Ducklin
Unless “without authorisation” ends up meaning the same sort of thing it does in the Computer Misuse Act: I can log in to my own computer, and I can give you permission to do so, too. But if I don’t give you permission… even if you just take a look, you’re in the wrong.
Dave
The solution to this is simple: vehicles that have carburetors and NO black boxes. I’m headed in that direction.
Paul Ducklin
Get a bicycle. No rego, no number plates, no licence, no parking hassles, no traffic queues, no petrol to buy and…
…OK, no motor.
Anonymous
Will this law be able to stop black hat hackers (who don’t follow laws)? No.
Will secure coding practices stop those same black hat hackers? Possibly.
Will this law stop white hat security researchers interested in helping make sure these systems follow secure coding practices? Yes.