When is a bug not a bug? That’s the question in play with a proof of concept (PoC) published by researcher Marius Tivadar, which can crash several versions of Windows, even if they’re locked, all within seconds of launching the code.
The PoC requires a USB key holding a malformed NTFS image to be physically inserted into a Windows PC with autoplay enabled. Regardless of the current privilege level (standard user or administrator), within seconds of the target PC trying to read data on the USB stick, the dreaded blue screen of death (BSOD) occurs, crashing the computer.
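The crash stems from the NTFS driver mishandling corrupted filesystem metadata that it parses automatically at mount time. To give a sense of the kind of on-disk structures involved, here is a minimal, illustrative Python sketch that reads a few well-known NTFS boot-sector fields from a synthetic image. To be clear: this is not Tivadar's PoC and crafts nothing malicious; the field offsets are the standard documented NTFS layout, and the sample volume values are made up for demonstration.

```python
import struct

def parse_ntfs_boot_sector(sector: bytes) -> dict:
    """Parse a few documented fields of an NTFS boot sector (VBR).

    Windows reads structures like this as soon as a volume is mounted,
    which is why a malformed image can reach driver code without any
    user interaction beyond inserting the USB key.
    """
    if len(sector) < 512:
        raise ValueError("boot sector must be at least 512 bytes")
    oem_id = sector[3:11]  # offset 0x03: should be b"NTFS    "
    if oem_id != b"NTFS    ":
        raise ValueError("not an NTFS volume")
    bytes_per_sector, = struct.unpack_from("<H", sector, 0x0B)
    sectors_per_cluster = sector[0x0D]
    total_sectors, = struct.unpack_from("<Q", sector, 0x28)
    mft_cluster, = struct.unpack_from("<Q", sector, 0x30)
    return {
        "bytes_per_sector": bytes_per_sector,
        "sectors_per_cluster": sectors_per_cluster,
        "total_sectors": total_sectors,
        "mft_cluster": mft_cluster,
    }

# Build a minimal, synthetic boot sector for demonstration only.
sector = bytearray(512)
sector[3:11] = b"NTFS    "
struct.pack_into("<H", sector, 0x0B, 512)     # bytes per sector
sector[0x0D] = 8                              # sectors per cluster
struct.pack_into("<Q", sector, 0x28, 204800)  # total sectors
struct.pack_into("<Q", sector, 0x30, 4)       # $MFT start cluster

info = parse_ntfs_boot_sector(bytes(sector))
print(info)
```

A fuzzer can mutate exactly these kinds of fields (sizes, cluster counts, offsets) to find inputs the kernel driver fails to validate, which is the general class of bug Tivadar's PoC exploits.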
That’s why Tivadar classifies this bug as a denial-of-service (DoS) vulnerability: a crash is as far as this specific issue goes, and at no point does any privilege escalation or unauthorized data access occur.
Tivadar says he reached out to Microsoft in July 2017 to disclose his findings, all in the hope that Microsoft would officially give this security issue a CVE and start working on a patch to fix the problem.
But because this bug requires a USB key to be physically inserted into a machine to work, Microsoft responded that this finding didn’t “meet the bar” for issuing a security patch – so no CVE and no patch will be forthcoming.
At the time of this writing, according to Tivadar, this issue remains unresolved, and his PoC bug still causes Windows BSODs even in the most recent version of the operating system.
This has stirred an interesting debate about whether the mere existence of a PC-crashing bug automatically merits a robust response and patch from Microsoft. No one disputes that Tivadar’s PoC works – the question is what to do about it.
Microsoft’s reason for rejecting this security issue for a CVE and patch response is, according to Tivadar, that it requires physical access to a machine to work. If an attack requires physical access to a machine, it’s not easily replicable or weaponizable at scale.
Plus, if you have physical access to a machine and you’re looking to cause problems, you can do a lot more than just cause it to crash.
That’s all well and good, says Tivadar, but for him it is just as much a matter of principle, particularly Microsoft’s apparent dismissal of the bug simply because it requires physical access. Tivadar writes on his GitHub documentation page:
As a security researcher, I think that every vulnerability that requires physical access and/or social engineering is important. We all know the stories Kevin Mitnick taught us regarding social engineering, so yes, these types of bugs are important.
Where do you fall in this debate? Is Microsoft’s response reasonable, or is it leaving Windows users at risk with its refusal to patch this issue?