A runaway strain of malware hit Windows computers Friday and spread through the weekend, rendering hundreds of thousands of computers around the world more or less useless. The big twist: The virus was made possible by U.S. government hackers at the National Security Agency. But the finger-pointing won’t stop there, and it probably shouldn’t.
Now that the worm, known as WannaCry, has been largely contained, attention has shifted to arguing over and assigning blame beyond the anonymous hackers who used leaked NSA code to assemble the virus, and whatever party decided to turn it into ransomware. Microsoft isn’t holding back.
In an unusually bold and forthright post by president Brad Smith, the company called out the NSA by name for not just creating, but “stockpiling” — and then, like a cyber Frankenstein, losing all control over — the exploits that made WannaCry possible:
This is an emerging pattern in 2017. We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen. And this most recent attack represents a completely unintended but disconcerting link between the two most serious forms of cybersecurity threats in the world today – nation-state action and organized criminal action.
Every software weakness the NSA (or CIA, or FBI) decides to use for itself in total secrecy is necessarily one it won’t share with a company like Microsoft, which could otherwise write and release a software update to keep its customers safe. (Whether or not you see this as a good and necessary thing likely has a lot to do with your opinion of whether the NSA too often prioritizes its ability to hurt adversaries over the privacy and safety of U.S. citizens, or of people in general.)
The government’s official decision to withhold or disclose is governed by something called the Vulnerabilities Equities Process (or VEP), whose exact workings are not entirely public. The VEP is meant to weigh the advantages gained by keeping a given software vulnerability secret against the potential risks to the world at large.
When the NSA adds to its arsenal an undisclosed software vulnerability, known as a “zero day,” rather than reporting it to the maker of the software, any common cybercriminal who happens to independently discover it will be free to exploit the security hole for their own ends, sometimes for years and years. Even if everything goes according to plan for the NSA, this sort of stockpiling values the military and intelligence community’s offensive capabilities over the digital safety of, well, literally everyone else, and is rightfully controversial.
But per Microsoft’s point, things haven’t been going according to plan recently, and our nation’s secret keepers have been having a lot of trouble keeping their computer weapons away from the likes of the Shadow Brokers and WikiLeaks. It’s a true and damning argument on Smith’s part: Whether due to internal leakers or external attackers, two of the most advanced and secretive spy agencies in the world have seen some of their most prized offensive tools snatched out of the shadows and not only made public, but weaponized against British hospitals, Chinese universities, and FedEx.

Congressman Ted Lieu, a rare legislator with any background in computer science, sees WannaCry as an opportunity to overhaul the VEP in favor of more disclosure: “Currently the Vulnerabilities Equities Process is not transparent and few people understand how the government makes these critical decisions,” the California Democrat wrote in a statement as WannaCry raged around the world. “Today’s worldwide ransomware attack shows what can happen when the NSA or CIA write malware instead of disclosing the vulnerability to the software manufacturer.”
The NSA did not create WannaCry. Rather, it discovered weaknesses in various versions of Windows and wrote programs that would allow American spies to penetrate computers running Microsoft’s operating system. It was one of these programs, codenamed ETERNALBLUE and repurposed by still-unidentified hackers, that allowed WannaCry to spread as quickly and uncontrollably as it did last week. Whether or not you think the causal chain is such that the NSA is in some sense morally responsible, it’s undeniable that without the agency’s work there is no ETERNALBLUE, and without ETERNALBLUE there is no May 2017 WannaCry crisis. In this sense, Microsoft is right, but the blame shouldn’t end there.
Microsoft also did not create WannaCry. But it did create something nearly as bad: Windows Vista, an operating system so horrendously bloated, broken, and altogether unpleasant to use that many PC users back in 2007 skipped upgrading altogether, opting instead to stick with the outdated Windows XP, a decision that has left many people on that decade-and-a-half-old operating system even today, years after Microsoft stopped updating it.
When Microsoft responded to the startling initial reports of ETERNALBLUE’s public release by noting that it had already inoculated Windows against the threat via a software patch, it did not mention that XP users were left out. Using an operating system past its expiration date is unwise, but in fairness to the millions of people around the world still running old versions of Windows, expecting consumers to regularly buy expensive software of uncertain quality is unwise too. It’s only relatively recently that Microsoft has started to shake off the stink of Vista (and the confusing Windows 8).
Some of the NSA’s defenders are quick to blame computer owners and IT administrators for not keeping their software current, but less likely to blame Microsoft for writing insecure code, alienating customers with shoddy operating systems and planned obsolescence, or dropping support for older OSes still in wide use. (The fact that Microsoft did actually release a WannaCry security patch for Windows XP over the weekend shows that it’s entirely possible to make old software safer.) It can’t be overstated that the choice to let older versions of Windows lapse into a state of permanent insecurity is as much a business strategy as an engineering decision, one that leaves Microsoft customers in the lurch when something like WannaCry breaks loose. For a large, high-stakes organization like a hospital or manufacturing plant, upgrading to the next version of Windows isn’t just a matter of waiting for the progress bar to fill; it’s a nightmarish web of compatibility issues with specialized hardware and niche third-party software. If letting a computer network you administer run Windows XP is negligent, it’s surely a negligence that pales next to losing a military cyberweapon, or abandoning vulnerable customers whose computers work more or less fine.
The NSA surely wants to do its work in full secrecy, undisturbed as much as possible by obligations to anyone or anything else; that’s the business it’s in. Microsoft surely wants to keep selling successive versions of Windows every several years and gradually forget about its earlier attempts; that’s the business it’s in. But these two agendas, one of militarism and absolute secrecy, the other of software profit maximization, create an environment that allows something like WannaCry to stomp across the globe, hobbling hospitals and train stations in its wake.