Officials from the FBI and the Department of Justice (DOJ) have been meeting with security researchers to work out how to get around encryption during criminal investigations, according to the New York Times.
Based on those meetings, DOJ officials are convinced they can get at encrypted data “without intolerably weakening the devices’ security against hacking,” the NYT reports, and thus are renewing their push for a legal mandate that would force tech companies to build encryption backdoors.
Sources familiar with the matter say that workshops on the subject have been convened by Daniel Weitzner, a computer science professor at the Massachusetts Institute of Technology (MIT).
The meetings have been attended by technologists including Ray Ozzie, a former chief software architect at Microsoft; Stefan Savage, a computer science professor at the University of California, San Diego; and Ernie Brickell, a former chief security officer at Intel. All three are working on techniques to help police get around encryption during investigations.
Savage said the meetings have focused on finding what the NYT called “a safe enough way” to unlock data on encrypted devices, as opposed to the separate matter of decoding intercepted messages sent via encrypted communications services, like Signal and WhatsApp. The newspaper quoted him:
The stuff I’ve been thinking about is entirely the device problem. I think that is where the action is. Data in motion and the cloud are much harder to deal with.
Presentations from Ozzie, Savage and Brickell were included in a report released last month by a National Academy of Sciences committee following an 18-month study of the encryption debate.
According to the NYT, Ozzie said that the researchers recognize that “this issue is not going away” and hence are trying to foster “constructive dialogue” rather than declaring that there’s no possible solution.
Ozzie’s spot-on when it comes to this issue not going away: law enforcement has for years been lobbying for a way to overcome what it calls the “going dark” problem. The phrase refers to the “enormous and increasing number of cases that rely on electronic evidence” that law enforcement can’t get at despite having the legal authority to do so, as FBI Director Christopher Wray described it during a speech in Boston earlier this month.
From his prepared remarks:
I recognize this entails varying degrees of innovation by the industry to ensure lawful access is available. But I just don’t buy the claim that it’s impossible.
In that speech, Wray picked up from where he left off in January, when he called unbreakable encryption a “public safety issue,” citing 7,775 devices that the FBI couldn’t crack in 2017 – more than half of those that the agency sought to lawfully access…
…which in turn picked up where his predecessor, James Comey, left off… which in turn followed Deputy Attorney General Rod Rosenstein making the same arguments multiple times last year.
In both speeches, Wray referenced a chat and messaging platform called Symphony, used by a group of major banks and marketed as offering something called “guaranteed data deletion.”
Pushed by a state regulator, four banks agreed to keep a copy of all communications sent to or from them through Symphony for seven years, Wray said. They also agreed to give copies of their encryption keys to independent law firms.
Wray:
So at the end, the data in Symphony was still secure, still encrypted, but also accessible to the regulators so they could do their jobs. I’m confident that by working together and finding similar areas to agree and compromise, we can come up with solutions to the Going Dark problem.
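What the banks agreed to is, in effect, old-fashioned key escrow: the data stays encrypted, but a copy of the decryption key sits with an independent custodian who can produce it under legal process. Here’s a minimal sketch of that idea in Python, using the third-party cryptography package; the two-recipient design and all the names are illustrative assumptions, not Symphony’s actual implementation:

```python
# Illustrative key-escrow sketch -- NOT Symphony's actual design.
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Hypothetical key pairs: one for the bank, one for the escrow law firm.
bank_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def encrypt_message(plaintext: bytes) -> dict:
    """Encrypt once, then wrap the data key for BOTH recipients."""
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    return {
        "nonce": nonce,
        "ciphertext": ciphertext,
        # The escrow copy is what makes the data "accessible to regulators".
        "key_for_bank": bank_key.public_key().encrypt(data_key, OAEP),
        "key_for_escrow": escrow_key.public_key().encrypt(data_key, OAEP),
    }

def escrow_decrypt(msg: dict) -> bytes:
    """What the law firm could do in response to a regulator's demand."""
    data_key = escrow_key.decrypt(msg["key_for_escrow"], OAEP)
    return AESGCM(data_key).decrypt(msg["nonce"], msg["ciphertext"], None)

msg = encrypt_message(b"wire transfer details")
assert escrow_decrypt(msg) == b"wire transfer details"
```

The obvious catch is that the security of every message now rests on the custodian’s ability to guard its copy of the key.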
Maybe it worked for four banks but, according to the NYT, the Symphony arrangement wouldn’t scale to millions of ordinary smartphone users. Another approach that has particularly caught the government’s eye is a system that would…
…generate a special access key that could unlock… data without the owner’s passcode. This electronic key would be stored on the device itself, inside part of its hard drive that would be separately encrypted – so that only the manufacturer, in response to a court order, could open it.
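Strip away the hard-drive language and that’s a device-resident escrow key: a blob, wrapped under a manufacturer key, that can reconstruct the data key without the owner’s passcode. A rough sketch of that architecture, with hypothetical names and none of the dedicated security hardware a real phone would use:

```python
# Illustrative sketch of the proposal above -- hypothetical, simplified.
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The manufacturer's private key lives off-device and is used only
# in response to a court order; only the public half ships on the device.
manufacturer_key = rsa.generate_private_key(public_exponent=65537,
                                            key_size=2048)

class Device:
    def __init__(self, manufacturer_pub):
        # Normal path: a key that protects user data (in reality this
        # would be entangled with the user's passcode).
        self.data_key = AESGCM.generate_key(bit_length=256)
        # Escrow path: the same key, wrapped for the manufacturer and
        # kept in a separately encrypted region of storage.
        self.escrow_blob = manufacturer_pub.encrypt(self.data_key, OAEP)

    def store(self, plaintext: bytes):
        self.nonce = os.urandom(12)
        self.ciphertext = AESGCM(self.data_key).encrypt(self.nonce,
                                                        plaintext, None)

# What the manufacturer could do, given a seized device's escrow blob:
def court_ordered_unlock(device: Device) -> bytes:
    data_key = manufacturer_key.decrypt(device.escrow_blob, OAEP)
    return AESGCM(data_key).decrypt(device.nonce, device.ciphertext, None)

phone = Device(manufacturer_key.public_key())
phone.store(b"photos, messages, location history")
assert court_ordered_unlock(phone) == b"photos, messages, location history"
```

Note where the single point of failure lands: whoever holds the manufacturer’s private key can unlock every device built this way.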
Nope, that wouldn’t work, according to Susan Landau, a Tufts University computer security professor. In a post on Lawfare, Landau said that the government is ignoring the “technical realities of what is possible to build securely – and what is not” when it argues that Silicon Valley can securely provide updates that undo security protections.
The updating process, modified to be used multiple times a day rather than a few times a year, is susceptible to subversion. The process could be automated, but that presents a security risk. It’s prudent to have eyes on the process of undoing the protections of someone’s phone.
A more likely model is that multiple people – a lawyer (to examine the court order requiring the phone unlock) and an engineer – would work together to approve a “security-undo” update. Letting so many people access the server that authorizes updates introduces human risks to the system. Risks may arise from malfeasance or from sloppiness – or, most likely, a combination of both.
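The “multiple people” model Landau describes is essentially a two-person rule, which you can picture as an update server that refuses to release a “security-undo” package unless both reviewers have signed it. A minimal sketch of that gate, with hypothetical roles and keys (Ed25519 signatures via the cryptography package):

```python
# Two-person approval gate for a hypothetical "security-undo" update.
# pip install cryptography
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical signing keys for the two required approvers.
lawyer_key = Ed25519PrivateKey.generate()      # reviews the court order
engineer_key = Ed25519PrivateKey.generate()    # reviews the update itself
approver_pubs = [lawyer_key.public_key(), engineer_key.public_key()]

def authorize_update(package: bytes, approvals: list) -> bool:
    """Release the update only if every required approver has signed it."""
    if len(approvals) != len(approver_pubs):
        return False
    for pub, sig in zip(approver_pubs, approvals):
        try:
            pub.verify(sig, package)   # raises InvalidSignature on failure
        except InvalidSignature:
            return False
    return True

undo_update = b"unlock-update: device serial 12345, court order attached"

# Both sign the same package: the gate opens.
assert authorize_update(undo_update,
                        [lawyer_key.sign(undo_update),
                         engineer_key.sign(undo_update)])

# Signatures reused on a different payload: rejected.
tampered = b"unlock-update: EVERY device"
assert not authorize_update(tampered,
                            [lawyer_key.sign(undo_update),
                             engineer_key.sign(undo_update)])
```

Her point is that every key, server and person in that loop is a fresh place for the process to fail, and, unlike ordinary software updates, it would have to hold up many times a day.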
Pfft! Details. The government’s attitude: perfection is the enemy of progress.
The NYT quoted Brickell, the former Intel official, who said that forcing tech companies to equip devices with powerful new access keys “is a difficult problem.” But it’s not insurmountable, he said:
Let’s keep working on it. But let’s not let the desire for a perfect solution get in the way of one that would help.
At any rate, the National Academy of Sciences’ report concluded that the proposed encryption-workaround schemes aren’t ready for prime time:
[The] proposed encryption schemes are not considered ready for deployment until they have undergone careful scrutiny by experts regarding their effectiveness, scalability, and security risks and been subject to real-world testing at realistic scale in the relevant contexts.
In a statement sent to the NYT and Ars Technica, Apple Senior Vice President for Software Engineering Craig Federighi said that proposals to weaken security “make no sense.”
We’re continuously strengthening the security protections in our products because the threats to data security intensify every day.
Proposals that involve giving the keys to customers’ device data to anyone but the customer inject new and dangerous weaknesses into product security. Weakening security makes no sense when you consider that customers rely on our products to keep their personal information safe, run their businesses, or even manage vital infrastructure like power grids and transportation systems. Ultimately protecting someone else’s data protects all of us so we need to move away from the false premise that privacy comes at the cost of security when in truth, it’s a question of security versus security.