Police body cameras open to attack

Police officers in the US often wear body cameras to protect themselves and reduce complaints from the public. Now, though, a security researcher has revealed that these cameras could put evidence – and even police officers themselves – at risk.
Josh Mitchell, a consultant at security firm Nuix, analysed cameras from five vendors that sell them to US law enforcement agencies. Presenting at the DEF CON conference last week, he highlighted vulnerabilities in several popular brands that could let an attacker take control of a body camera and tamper with its video.
Attackers could access cameras in several ways, Mitchell said. Many of them include Wi-Fi radios that broadcast unencrypted sensitive information about the device. This enables an attacker with a high-powered directional antenna to snoop on devices and gather information including their make, model, and unique ID. An attacker could use this information to track a police officer’s location and find out more about the device that they are using. They might even be able to tell when several police officers are coordinating a raid, he said.
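To get a feel for why unencrypted broadcasts are such a giveaway, the sketch below shows how little code passive identification takes. It is a minimal illustration in Python using the scapy packet library: it listens for Wi-Fi beacon frames and flags any whose sender MAC address starts with a given manufacturer prefix. The interface name, the example prefix and the assumption that the SSID reveals the model are placeholders for this article, not details from Mitchell’s talk.

    # Passive sketch: watch for beacon frames from a given manufacturer prefix (OUI).
    # "wlan0mon" and the OUI below are illustrative placeholders.
    from scapy.all import sniff, Dot11, Dot11Beacon, Dot11Elt

    CAMERA_OUI = "00:25:df"  # hypothetical vendor prefix, not a real camera maker's

    def handle(pkt):
        if pkt.haslayer(Dot11Beacon):
            mac = pkt[Dot11].addr2 or ""
            if mac.lower().startswith(CAMERA_OUI):
                ssid = pkt[Dot11Elt].info.decode(errors="replace")
                print(f"Possible body camera nearby: MAC={mac} SSID={ssid}")

    # Needs a wireless card in monitor mode and root privileges.
    sniff(iface="wlan0mon", prn=handle, store=False)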
Mitchell’s research found that some devices also include their own Wi-Fi access points but don’t secure them properly. An intruder could connect to one of these devices, view its files and even download them, he warned. In many cases, the cameras relied on default login credentials that an attacker could easily look up and use.

Evidence tampering

One potential attack involves tampering with legal evidence. An intruder could delete video footage from a camera, or even download and manipulate it before uploading it again. Many of these cameras are exposed to this because they don’t cryptographically ‘sign’ their video to prove that it hasn’t been tampered with, Mitchell pointed out.
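As a rough illustration of what signing would add, the sketch below (Python, using the third-party cryptography package; the clip contents are a placeholder) has the camera sign each clip with a private key and an evidence system verify it later with the matching public key, so any change to the bytes breaks the check. This is a generic pattern, not any vendor’s actual design.

    # Illustrative only: sign footage at capture time, verify before it is relied on as evidence.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    camera_key = Ed25519PrivateKey.generate()      # would be held inside the camera
    evidence_key = camera_key.public_key()         # would be held by the evidence system

    footage = b"\x00\x00\x00\x1cftypmp42..."       # placeholder for the recorded clip's bytes
    signature = camera_key.sign(footage)           # stored alongside the clip

    # Later, before the clip is used in court:
    try:
        evidence_key.verify(signature, footage)
        print("Signature valid: footage unchanged since capture")
    except InvalidSignature:
        print("Signature check FAILED: footage may have been altered")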
Being able to tamper with footage raises some interesting scenarios. Deepfakes, which are videos that have been altered using AI technology, are increasingly realistic. Researchers have demonstrated how they can make individuals say things on video that they didn’t say in reality. Could an attacker use AI to alter a stolen body camera video to fake a conversation, and then upload it back to a compromised camera?
The US government is worried enough about deepfakes to devote research funding to the problem, and the US Defense Advanced Research Projects Agency (DARPA) recently gave the threat more credence by singling it out as an area of concern.
Or perhaps an attacker could just fake the officer’s voice using ‘Deep Voice’ software instead.
Vulnerabilities such as these raise some interesting legal questions. If a police force has not taken the appropriate measures to protect its footage and cryptographically sign it, could an enterprising defence lawyer one day use these weaknesses in a case to question the chain of custody of video evidence?

Turning a camera into a weapon

The lack of digital signing extends to the firmware that the cameras run, too. Failing to check that firmware is authentic is a rookie mistake in IoT devices: if an attacker could get malicious firmware onto a body camera, the device would be under their control. And because most of these devices connect to desktop or mobile applications that Mitchell said have their own vulnerabilities, it is theoretically possible to hijack a camera through some of those programs.
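For comparison, the sketch below shows the kind of check a signed-firmware scheme performs: the vendor signs each release, and the device refuses to flash an image whose signature does not verify against the vendor’s public key. Again this is a generic Python illustration using the cryptography package, with placeholder data, not any manufacturer’s actual update mechanism.

    # Illustrative only: the vendor signs a release; the device verifies before flashing.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    vendor_key = Ed25519PrivateKey.generate()           # held by the vendor at build time
    firmware_image = b"\x7fELF...placeholder image..."  # stand-in for a real firmware file
    release_signature = vendor_key.sign(firmware_image)

    device_trusted_key = vendor_key.public_key()        # baked into the camera at manufacture

    def install_firmware(image: bytes, signature: bytes) -> bool:
        try:
            device_trusted_key.verify(signature, image)
        except InvalidSignature:
            print("Rejecting update: not signed by the vendor")
            return False
        print("Signature valid; flashing firmware")
        return True

    install_firmware(firmware_image, release_signature)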
Body cameras could even become attack vectors for law enforcement networks, Mitchell warned. An attacker could plant a malicious file on a camera via a wireless link; the camera might then transfer that file to a desktop machine when a police officer docks it back at headquarters, infecting the police network.
In the spirit of responsible disclosure, Mitchell contacted the vendors about these vulnerabilities and has been working with them to fix the issues, he said. In the meantime, his findings should leave police forces thinking hard about security audits for their wearable devices.