All you bug hunters out there are about to get a nice Christmas gift – the US federal government finally wants to hear from you. Unhelpful websites and cybersecurity departments will soon be a thing of the past, thanks to a new missive from the Cybersecurity and Infrastructure Security Agency (CISA).
The Agency, which is part of the Department of Homeland Security, issued a surprising tweet on 27 November announcing that it would force federal agencies to be welcoming and responsive to cybersecurity bug reports from the general public:
Binding Operational Directive 20-01 would finally give ‘helpful hackers’ a sense of legitimacy when reporting bugs to federal government agencies in the US, solving some problems that CISA admits to pretty freely in the document. It says:
Choosing to disclose a vulnerability can be an exercise in frustration for the reporter when an agency has not defined a vulnerability disclosure policy – the effect being that those who would help ensure the public’s safety are turned away.
The directive acknowledges that researchers often don’t know how to report a bug when agencies don’t include an authorized disclosure channel in the form of a webpage or email address. They shouldn’t have to search out security employees’ personal contact information, it points out.
Communication after a bug report is just as important, CISA says. An inadequate response to a bug report, or no response at all, may prompt a researcher to report the bug elsewhere outside the agency’s control.
Perhaps the most egregious mistake that agencies make is threatening legal action. The directive admits that the federal government has a reputation for being heavy-handed and defensive in response to bug reports. Threatening language warning against unauthorized use can also choke off a useful stream of bug reports, it says.
The directive draws a distinction between a vulnerability reporting program and a paid bug bounty initiative. While often useful, the latter isn’t mandatory, it says.
What is now mandatory is a system for receiving unsolicited reports about security bugs. That means updating the .gov domain registrar with a security contact field for each registered domain, and agencies must use trained staff to monitor those email addresses.
Agencies must make these updates within 15 days of the directive taking effect. Then, 180 days after it comes into force, they must publish an official vulnerability disclosure policy on their websites, defining which systems are in scope, what types of testing are allowed, and how to submit vulnerability reports.
The policy must also tell researchers when they can expect a response, and commit to letting them know what’s happening as the agency fixes the bug.
Finally, it must promise not to sue security researchers for reporting security flaws.
The document requires agencies to bring all newly launched internet-accessible systems into scope from the outset, and all existing internet-accessible systems within two years.
Agencies can’t just hold onto bugs indefinitely under the terms of the directive, because their policies must allow researchers to report bugs elsewhere after a reasonable time period.
CISA also protects bug reports against use by spooks. It explicitly forbids agencies from funneling them through to the Vulnerabilities Equities Process (VEP). This is the government process that decides whether to disclose security flaws so they can be fixed, or to keep them secret as potential weapons against others.
While broadly welcomed, CISA’s directive proposal drew a measured response from Katie Moussouris, CEO of security company Luta Security, who said that the timelines were too long:
Her Twitter thread warns agencies against simply turning to a third-party bug bounty program as they scramble to meet the directive’s requirements. Still, as she notes, some federal organizations have used them in the past: the Pentagon famously used HackerOne’s bug bounty services to find bugs in its systems.
Although the government may move at a glacial pace in implementing this new initiative, its approach is still better than some policies in the private sector, as we saw when researchers had problems reporting bugs to open source projects like Bitcoin Cash.