
FBI “should not be reluctant” to challenge encryption in court

The 2016 FBI vs Apple battle in federal court over government access to encrypted devices never settled the issue. When a contractor hired by the FBI was able to break into the iPhone of a mass shooter, the case became moot.

But the US Department of Justice (DoJ) contends there are thousands more locked phones that it has a right to access. So it probably shouldn’t be a surprise that the DoJ and Silicon Valley are likely headed for another collision in court, courtesy of US Deputy Attorney General Rod Rosenstein.

Rosenstein has been giving a lot of speeches lately about “responsible encryption,” which he defines as encryption that can be defeated by any law enforcement agency bearing a warrant, but is otherwise bulletproof against everyone but the user.

And encryption experts have said just as frequently that this is magical thinking – that it is impossible to have encryption work effectively if there is a way to defeat it, not least because that method will fall into the hands of hackers sooner rather than later, making every device vulnerable.

The signal that the conflict is headed to federal court again came this past week, just a couple of days after the FBI announced it had not been able to break into the iPhone of Devin Patrick Kelley, the shooter in the gun massacre in Sutherland Springs, Texas.

Rosenstein, in a lengthy interview with Politico Pro, said:

I want our prosecutors to know that, if there’s a case where they believe they have an appropriate need for information and there is a legal avenue to get it, they should not be reluctant to pursue it. I wouldn’t say we’re searching for a case. I’d say we’re receptive, if a case arises, that we would litigate.

Which makes it sound as though “searching for” and “receptive to” are pretty much the same thing.

And that, of course, is because Round 1 ended without settling the fundamental conflict. It was launched after the mass shooting in San Bernardino, California, in December 2015, when the FBI was unable to unlock the iPhone of one of the shooters.

A couple of months later, a federal judge ordered Apple to provide “reasonable technical assistance” to the FBI, which meant finding a way around the mechanism that locks all the data on the phone after ten incorrect passcode attempts.

Apple CEO Tim Cook refused, saying it would amount to providing the FBI with a master key – if it could unlock that phone, it could unlock any other.

The showdown ended when the FBI said it had been able to access the phone with the help of a third party – reportedly the Israeli mobile forensics firm Cellebrite, although that was never confirmed by the agency. This past September, a federal court ruled that the agency did not have to make the name of the company public, because doing so would make the company a prime target for hackers and could also threaten national security.

But the conflict remains. Rosenstein has said there are more than 7,000 phones in law enforcement custody that remain locked, and told Politico Pro that tech companies are “moving in favor of more and more warrant-proof encryption.”

But, as Ars Technica noted last week, the DoJ and other law enforcement agencies, including the FBI, are working on defeating encryption with the help of Cellebrite or firms like it. Within the FBI is a department called the National Domestic Communications Assistance Center (NDCAC), which gives technical assistance to local law enforcement agencies.

The most recently published minutes of the NDCAC, from May 2017, said one of the center’s goals is to make services like Cellebrite’s “more widely available” to state and local law enforcement.

That’s already being done – in a sextortion case in Miami earlier this year, the NDCAC gave money to local law enforcement to pay Cellebrite to unlock a seized iPhone.

But that approach could be both expensive and complicated. The FBI reportedly paid Cellebrite about $900,000 to unlock a single phone, and as device makers strengthen their encryption, it may take considerable time and growing expense for companies like Cellebrite to keep breaking it.

So the debate before federal judges will likely sound a lot like the one playing out in speeches, blogs and interviews. On the law enforcement side, figures like former FBI director James Comey and now Rosenstein argue that it should be possible for companies like Apple to create a “key” to defeat encryption when law enforcement has a warrant to search a device. They say those companies wouldn’t have to give the key to the government – they could protect it within their own organizations.

Rosenstein has argued in his speeches that tech companies already provide access to encrypted data through things like the management of security keys and operating system updates. This past week, he compared it to door locks for a house. “People want to secure their houses, but they still need to get in and out,” he told Politico Pro. “Same issue here.”

But encryption experts, noting that there is no such thing as bulletproof security, say that if such a key exists, it will soon be in the hands of everybody else as well. Which would be like everybody getting the key to your house – and to every other house.
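To make the two positions concrete, here is a minimal sketch of the kind of key-escrow (“envelope encryption”) design Rosenstein’s proposal implies: the per-device key is wrapped once for the user and once for an escrow holder inside the vendor. Everything below – the names, the parameters, the choice of AES-GCM and RSA-OAEP from Python’s cryptography package – is an illustrative assumption, not a description of Apple’s or any vendor’s actual implementation.

```python
# Illustrative only: a generic key-escrow / envelope-encryption sketch.
# Assumes the `cryptography` package; all names and parameters are hypothetical.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The vendor keeps the escrow key pair "within their own organization".
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
escrow_public = escrow_private.public_key()

# Per-device data key, used to encrypt everything on the phone.
data_key = AESGCM.generate_key(bit_length=256)

# Copy 1: the data key wrapped under a key derived from the user's passcode.
salt = os.urandom(16)
kek = PBKDF2HMAC(hashes.SHA256(), length=32, salt=salt,
                 iterations=600_000).derive(b"user-passcode")
nonce = os.urandom(12)
wrapped_for_user = AESGCM(kek).encrypt(nonce, data_key, None)

# Copy 2: the same data key wrapped under the escrow public key, for warrants.
wrapped_for_escrow = escrow_public.encrypt(data_key, OAEP)

# The crux of the dispute: anyone holding escrow_private - the vendor, a court,
# or a thief who steals it - can recover the data key for *every* device.
assert escrow_private.decrypt(wrapped_for_escrow, OAEP) == data_key
```

On paper the scheme works; the experts’ objection, echoed below by Schneier, is that escrow_private is exactly the master key Cook warned about – lose control of it once and every device that trusted it is exposed.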

Bruce Schneier, CTO at IBM Resilient and an encryption expert, has called Rosenstein’s reasoning “absurd” a number of times. Last year, in a paper sponsored by the Berkman Center for Internet & Society, he used a different image:

Compare this with the tactic of secretly poisoning all the food at a restaurant. Yes, we might get lucky and poison a terrorist before he strikes, but we’ll harm all the innocent customers in the process. Weakening encryption for everyone is harmful in exactly the same way.

Rosenstein continues to argue that right now, the cost of strong (what he calls “irresponsible”) encryption is too great.

There is a cost to having impregnable security, and we’ve talked about some of the aspects of that. The cost is that criminals are going to be able to get away with stuff, and that’s going to prevent us in law enforcement from holding them accountable.

It is, of course, good politics to sell an encryption backdoor as a way to prevent terrorism, or to hold terrorists accountable. But good politics doesn’t necessarily make good law.

 
