Earlier this week, the US Department of Justice unsealed an indictment against a group of hackers known as Sandworm. The document charged six hackers working for Russia’s GRU military intelligence agency with computer crimes related to half a decade of cyberattacks across the globe, from sabotaging the 2018 Winter Olympics in Korea to unleashing the most destructive malware in history in Ukraine. Among those acts of cyberwar was an unprecedented attack on Ukraine’s power grid in 2016, one that appeared designed not merely to cause a blackout but to inflict physical damage on electric equipment. And when cybersecurity researcher Mike Assante dug into the details of that attack, he recognized a grid-hacking idea invented not by Russian hackers, but by the United States government, and tested a decade earlier.
The following excerpt from the book SANDWORM: A New Era of Cyberwar and the Hunt for the Kremlin’s Most Dangerous Hackers, published in paperback this week, tells the story of that early, seminal grid-hacking experiment. The demonstration was led by Assante, the late, legendary industrial control systems security pioneer. It would come to be known as the Aurora Generator Test. Today, it still serves as a powerful warning of the potential physical-world effects of cyberattacks—and an eerie premonition of Sandworm’s attacks to come.
On a piercingly cold and windy morning in March 2007, Mike Assante arrived at an Idaho National Laboratory facility 32 miles west of Idaho Falls, a building in the middle of a vast, high desert landscape covered with snow and sagebrush. He walked into an auditorium inside the visitors’ center, where a small crowd was gathering. The group included officials from the Department of Homeland Security, the Department of Energy, and the North American Electric Reliability Corporation (NERC), executives from a handful of electric utilities across the country, and other researchers and engineers who, like Assante, were tasked by the national lab to spend their days imagining catastrophic threats to American critical infrastructure.
At the front of the room was an array of video monitors and data feeds, set up to face the room’s stadium seating, like mission control at a rocket launch. The screens showed live footage from several angles of a massive diesel generator. The machine was the size of a school bus, a mint green, gargantuan mass of steel weighing 27 tons, about as much as an M3 Bradley tank. It sat a mile away from its audience in an electrical substation, producing enough electricity to power a hospital or a navy ship and emitting a steady roar. Waves of heat coming off its surface rippled the horizon in the video feed’s image.
Assante and his fellow INL researchers had bought the generator for $300,000 from an oil field in Alaska. They’d shipped it thousands of miles to the Idaho test site, an 890-square-mile piece of land where the national lab maintained a sizable power grid for testing purposes, complete with 61 miles of transmission lines and seven electrical substations.
Now, if Assante had done his job properly, they were going to destroy it. And the assembled researchers planned to kill that very expensive and resilient piece of machinery not with any physical tool or weapon, but with about 140 kilobytes of data, a file smaller than the average cat GIF shared today on Twitter.
Three years earlier, Assante had been the chief security officer at American Electric Power, a utility with millions of customers in 11 states from Texas to Kentucky. A former navy officer turned cybersecurity engineer, Assante had long been keenly aware of the potential for hackers to attack the power grid. But he was dismayed to see that most of his peers in the electric utility industry had a relatively simplistic view of that still-theoretical and distant threat. If hackers did somehow get deep enough into a utility’s network to start opening circuit breakers, the industry’s common wisdom at the time was that staff could simply kick the intruders out of the network and flip the power back on. “We could manage it like a storm,” Assante remembers his colleagues saying. “The way it was imagined, it would be like an outage and we’d recover from the outage, and that was the limit of thinking around the risk model.”