Meet the new year, same as the old year – at least when it comes to the debate over encryption.
Top law enforcement officials – this week it was FBI director Christopher Wray – are still contending that it is possible to give government (and only government) “back-door” access to the encrypted digital devices of alleged criminals, without jeopardizing the encryption of every other device on the planet.
And the response from privacy advocates and encryption experts remains that this is magical thinking – at the level of asserting that if there is just enough good-faith cooperation between the tech sector and law enforcement, it will be possible for pigs to fly.
Wray “picked up where he left off last year,” as the Register put it, in a speech this week at the International Conference on Cyber Security, held at Fordham University in New York.
He made the same arguments he had made a month earlier, on 7 December, before the House Judiciary Committee, that selective encryption access is possible without jeopardizing its effectiveness for everybody, and that the need for it is beyond urgent, to protect innocent citizens from criminals and terrorists who are using encrypted devices to “go dark.”
And those arguments pick up where his predecessor, James Comey, left off. They also amount to something of a tag-team approach with Deputy Attorney General Rod Rosenstein, who, as we reported, made the same arguments multiple times last year.
By now, they are familiar. First up are frightening statistics – the agency has been unable to crack thousands of devices. Wray said that in 2017 the number was 7,775 – more than half of the devices the agency tried to access, even though it had “lawful authority to do so.”
Second, even though the metadata from those devices – “transactional” information about calls, texts and messages – is available, that is not nearly enough.
…for purposes of prosecuting terrorists and criminals, words can be evidence, while mere association between subjects isn’t evidence.
Third, that the world has changed in the last decade to the point that terrorists, child predators, nation states and others can operate almost freely under the cloak of unbreakable encryption. This, Wray said, is “a major public safety issue.”
This problem impacts our investigations across the board – human trafficking, counterterrorism, counterintelligence, gangs, organized crime, child exploitation, and cyber.
And fourth, that what he, Rosenstein and Comey have all called “responsible encryption” is possible – that the makers of encrypted devices can create a secret key to unlock them (a key they don’t even have to give to the government!), which can then be used when law enforcement comes bearing a locked phone and a warrant.
Wray’s example was the chat and messaging platform Symphony, used by major banks, in large part because it guarantees “data deletion.” He said bank regulators were concerned that this could hamper their investigations of Wall Street.
So, he said, four New York banks reached an agreement:
They agreed to keep a copy of all e-communications sent to or from them through Symphony for seven years. The banks also agreed to store duplicate copies of the decryption keys for their messages with independent custodians who aren’t controlled by the banks. So the data in Symphony was still secure and encrypted – but also accessible to regulators, so they could do their jobs.
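To make that arrangement concrete, here is a minimal sketch of how escrowed access is usually built – an illustration of the general technique only, not Symphony’s actual design; the key names and the custodian role are assumptions for the example. Each message is encrypted with a fresh symmetric key, and that key is then wrapped twice, once for the bank and once for an independent custodian, so either party can decrypt without the cipher itself being weakened:

```python
# Illustrative key-escrow sketch using the Python "cryptography" package.
# NOT Symphony's real design: the roles and key names are assumptions.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical key pairs: one held by the bank, one by an independent custodian.
bank_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
custodian_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_message(plaintext: bytes):
    data_key = AESGCM.generate_key(bit_length=256)      # fresh per-message key
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    # Wrap the same data key twice: once for the bank, once for the custodian.
    return {
        "nonce": nonce,
        "ciphertext": ciphertext,
        "key_for_bank": bank_key.public_key().encrypt(data_key, OAEP),
        "key_for_custodian": custodian_key.public_key().encrypt(data_key, OAEP),
    }

def custodian_decrypt(msg):
    # A regulator with access to the custodian's private key can recover the
    # data key and read the message; the bank's own key is never needed.
    data_key = custodian_key.decrypt(msg["key_for_custodian"], OAEP)
    return AESGCM(data_key).decrypt(msg["nonce"], msg["ciphertext"], None)

msg = encrypt_message(b"trade confirmation")
print(custodian_decrypt(msg))
```

The messages stay strongly encrypted in transit and at rest, but the security of every one of them now also depends on how well the custodian’s private key is guarded – which is exactly the concentration of risk that critics of escrowed access point to.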
Wray insisted this does not mean that the FBI is seeking a back door, “which I understand to mean some type of secret, insecure means of access.”
But that is exactly what civil liberties and encryption experts say he is seeking. Bruce Schneier, CTO of IBM Resilient Systems and an encryption expert, has said many times that it is absurd to think that encryption can “work well unless there is a certain piece of paper (a warrant) sitting nearby, in which case it should not work.”
Mathematically, of course, this is ridiculous. You don’t get an option where the FBI can break encryption but organized crime can’t. It’s not available technologically.
Schneier has likened the selective encryption key argument to poisoning all the food in a restaurant – you might poison a terrorist, but you would also poison every other innocent person eating there.
And Tim Cushing, writing last month in Techdirt after Wray’s appearance before the House Judiciary Committee, contends that the number of locked phones held by the FBI is meaningless because “there’s no context provided by the FBI, nor will there ever be.”
The FBI is unwilling to divulge how many accessed phones are dead ends and how many cases it closes despite the presence of a locked device.
Not to mention, as numerous critics have highlighted, that the government’s record of securing its own crucial data is, well, spotty at best. There are the continuing dumps by WikiLeaks of hacking tools held by the CIA and NSA (and not shared with the private sector). There was the catastrophic hack of the federal Office of Personnel Management (OPM) several years ago that exposed the personal information of about 22 million current and former federal employees.
As the FBI acknowledged, it was able to defeat the encryption on a phone used by one of the shooters in the December 2015 San Bernardino terrorist attack only through the help of a “third party,” reportedly the Israeli mobile forensics firm Cellebrite, which it is said to have paid $900,000.
The FBI also has a department called the National Domestic Communications Assistance Center (NDCAC), which provides technical assistance to local law enforcement agencies. One of its goals is to make tools like Cellebrite’s services “more widely available” to state and local law enforcement.
But the fundamental issue will likely be resolved only through legislative action or a court decision. Rosenstein said last fall that he would like to see the matter come to the courts.
I want our prosecutors to know that, if there’s a case where they believe they have an appropriate need for information and there is a legal avenue to get it, they should not be reluctant to pursue it. I wouldn’t say we’re searching for a case. I’d say we’re receptive, if a case arises, that we would litigate.
James Moran
Why the U.S. alphabet agencies can’t come up with their own technology to defeat encryption or unlock one phone is beyond me. We’re in trouble when the FBI has to rely on a foreign company like Israel’s Cellebrite.
jonathanpdx
Giving the government access to encryption only to be used when “authorized” reminds me of the tale of the frog and the scorpion…we all know how that ended.
Greybeard
In related news, water is wet, pain hurts, and lying makes it harder to determine the truth.
Jim
I’m a firm believer that encryption can have two methods of decryption, and do so securely. Encryption “experts” who claim otherwise are simply thinking inside a self-imposed box.
But, the problem for law enforcement isn’t criminals who have already committed crimes. The law could be changed to make refusal to decrypt equivalent to a guilty plea, as China and Russia do. Whether that would be a good thing is a question for another day.
No, the real problem is that the big boys they mention (terrorists, drug dealers, and child porn networks) aren’t going to use the government-decryptable encryption. They’ll simply develop their own that is still uncrackable. For those people, there’s really no good answer, except to hope that they make a mistake.
Paul Ducklin
The US tried encryption with “key escrow” for law enforcement in the 1990s – see Clipper chip and Skipjack algorithm. Didn’t work out. Problems: [1] people didn’t much like the idea, [2] what to do with the giant cupboard full of escrowed keys? (And, one imagines, [3] how to survive the likely lawsuits if that cupboard ever sprung a leak.)
Jim
Yeah, that won’t work. The cupboard springing a leak is the key issue: not all members of law enforcement can be trusted. Plus, it’s not just the bad ones that are an issue; cops make mistakes, too, and even a small one with escrowed keys is a potential disaster.
It would have to be an algorithmic equation that always has exactly two unique solutions, rather than just one. Or, a two-layer encryption of some kind. Better minds than I would have to work it.
But, for me, escrowing keys isn’t the answer.
But, even if such an animal could be created (I think it can, but emphasize THINK), it wouldn’t solve the problem they’re using to justify it. But, it might work on lower-level thugs, so it’s worth exploring (IMO).
delayedthoughtengineering
The criminals that the government wants to uncover won’t have to develop anything. It’s already here, ready to go, in use, now.
The government is indeed after criminals who have already committed crimes, even if the crime merely falls under “conspiracy”, such as “conspiracy to commit murder”.
Jim
Correct. Their excuse for needing it doesn’t fly.
David M
Sadly for us plebs, the Julian Assange saga, including the UN finding that Julian Assange was/is arbitrarily detained by Sweden and the UK, has proven beyond the shadow of a doubt that governments cannot be trusted to hold all the keys to people’s lives.
Governments: you want the trust? Begin to earn it.
Andrew
Governments are untrustworthy, but all your example proves is that the UN makes some very daft findings at times.
Anonymous
So, they want a key that could break secure encryption in case they need to go after criminals?
It means that none of their employees should ever turn out to be (a) a whistleblower “WikiLeaks style” or (b) someone engaging in criminal activities. While neither is very common, both have occurred in the past. It would require only one person to wreck huge parts of the economy (ecommerce, banks, …) without stopping the criminals, who can develop their own unbreakable encryption…
Lee
The NSA says it all: everything that came out regarding Edward Snowden sums up the government and its lies.
After that incident I don’t see why anyone would trust what a government says again!
Kyle
The government will definitely only use this kind of thing responsibly – just look at stingrays and how they are used for “serious crimes only,” like catching a petty thief.