
Should governments keep vulnerabilities secret?

The debate will probably never end over how many vulnerabilities US intelligence agencies should hoard, and how they should use them, when those flaws sit in software used by their own citizens.

But two recent research papers, presented together at Black Hat, argue that data analysis should carry more weight than “speculation and anecdote” in setting government policy on the matter.

By now, the cases for both sides are well established:

According to one, intelligence agencies need to maintain a secret stash of vulnerabilities if they are to have any hope of penetrating the communications or cyber weapons of criminals, terrorists and hostile governments. It is a crucial element of protecting the homeland.

According to the other, the problem is that they don’t always remain secret. And if those vulnerabilities are leaked or otherwise discovered by malicious actors, they can be exploited to attack millions of innocent users, or critical systems, before they are patched. Which can be very damaging to the homeland.

Indeed, the release about a year ago by the so-called Shadow Brokers of a cache of top-secret spying capabilities, presumed to belong to the National Security Agency (NSA), intensified the complaints about government surveillance going well beyond targeting terrorists.

Jason Healey, senior research scholar in cyber conflict and risk at Columbia University’s School of International and Public Affairs and a senior fellow at the Atlantic Council, noted in a paper last November that of the 15 zero-days in the initial release, several “were in security products produced by Cisco, Juniper, and Fortinet, each widely used to protect US companies and critical infrastructure, as well as other systems worldwide.”

More recently, starting in March, the WikiLeaks “Vault 7” document dump exposed the CIA’s efforts to exploit Microsoft and Apple technology to enable surveillance.

Without wading into the specific debate over keeping vulnerabilities secret, several researchers at Harvard Kennedy School’s Belfer Center for Science and International Affairs recently published a paper they hoped would “call for data and more rigorous analysis of what it costs the intelligence community when they disclose a vulnerability and what it costs citizens and users when they don’t.”

The researchers reported that their analysis of a dataset of more than 4,300 vulnerabilities found that “(vulnerability) rediscovery happens more often than previously reported”.

They went on to say:

When combined with an estimate of the total count of vulnerabilities in use by the NSA, these rates suggest that rediscovery of vulnerabilities kept secret by the US government may be the source of as many as one-third of all zero-day vulnerabilities detected in use each year
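As a rough illustration of the arithmetic behind that claim (a back-of-envelope sketch only; the symbols below are placeholders, not figures taken from the Belfer paper), the estimate combines a rediscovery rate with a stockpile size:

\[
\text{share of detected zero-days due to rediscovery} \;\approx\; \frac{r \times N}{D}
\]

where \(r\) is the annual rate at which an independent party rediscovers a stockpiled vulnerability, \(N\) is the estimated number of vulnerabilities the US government keeps secret, and \(D\) is the total number of zero-day vulnerabilities detected in use each year. Under those assumptions, values of \(r\) and \(N\) large enough to make the numerator roughly a third of \(D\) would yield the one-third figure quoted above.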

Another paper, by the RAND Corporation, is aimed at establishing “some baseline metrics regarding the average lifespan of zero-day vulnerabilities, the likelihood of another party discovering a vulnerability within a given time period, and the time and costs involved in developing an exploit for a zero-day vulnerability.”

According to RAND’s findings, zero-day exploits and their underlying vulnerabilities “have a rather long average life expectancy (6.9 years). Only 25% do not survive to 1.51 years, and only 25% live more than 9.5 years.”

The combined Black Hat presentation focused in part on what both groups studied – the Vulnerability Equities Process (VEP), which helps determine if “a software vulnerability known to the US government will be disclosed or kept secret.”

The Belfer Center group said they did not intend to “relitigate” the VEP debate in their paper. But given that they and those at RAND have done some of the “rigorous analysis” they called for, why not at least weigh in on it?

Trey Herr, a postdoctoral fellow with the Belfer Center’s Cyber Security Project and a co-author of that paper, said that while government use of vulnerabilities is “necessary,” he thinks government has not “walked the line very well” between disclosure and secrecy.

He said that even former NSA director Gen. Michael Hayden has acknowledged that if the agency can’t keep its secret capabilities secure, it shouldn’t be allowed to have them.

But Herr, in a post on the Lawfare blog, said he is hopeful that the recently introduced PATCH (Protecting our Ability to Counter Hacking) Act in Congress will “codify” the VEP “to facilitate accountability and continuity between administrations.”

He said the bill is significant because, “it marks the first time that Congress will be actively involved in meaningful discussion about government disclosure of vulnerabilities.”

But Herr also contends that even if the correct balance between disclosure and secrecy is achieved, “that is only a small piece of the puzzle when it comes to software security.”