Our Critical Infrastructure – Their Cyber Range

There is a risk that we overanalyze attacks on critical infrastructure and try to find a strategic intent where there is none. Our potential adversaries, in my view, could attack critical American infrastructure for reasons other than executing a national strategy. In many cases, it can be as simple as hostile totalitarian nations that do not respect international humanitarian law using critical American infrastructure as a cyber range. Naturally, the focus of their top-tier operators is on conducting missions within the strategic direction, but lower-echelon operators can use foreign critical infrastructure as a training ground. If the political elite sanctions these actions, nothing stops a rogue nation from attacking our power grid, waterworks, and public utilities to train its future advanced cyber operators. The end game is not the critical infrastructure itself; rather, critical infrastructure provides an educational opportunity.

Naturally, we have to defend critical infrastructure because by doing so, we protect the welfare of the American people and the functioning of our society. That said, just because it is vital to us does not automatically mean it is crucial to the adversary.

Cyberattacks on critical infrastructure can have different intents. There is a similarity between cyber and national intelligence: both try to make sense of limited information gleaned from a denied information environment. In reality, our knowledge of the strategic intent and goals of our potential adversaries is limited.

We can study the adversary's doctrine, published statements, tactics, techniques, and past events, but significant gaps remain in our understanding of the intent behind the attacks. We assess the adversary's strategic intent from the outside, which often amounts to qualified guesses, with all the uncertainty that comes with them. Many times, logic and past behavior are the only guides. Nation-state actors tend to seek a geopolitical end goal: changing policy, destabilizing the target nation, or acquiring information they can use to their benefit.

Attacks on critical infrastructure make news headlines, and for a less capable potential adversary, they can serve as a way to show a domestic audience that it can threaten the United States. In 2013, Iranian hackers broke into the control system of a dam in Rye Brook, N.Y. The actual damage was limited by circumstances the hackers did not know about: maintenance procedures underway at the facility limited the risk of broader damage.

The limited intrusion into the control system made national news and engaged the State of New York, elected officials, the Department of Justice, the Federal Bureau of Investigation, the Department of Homeland Security, and several more agencies. Time magazine ran the headline "Iranian Cyber Attack on New York Dam Shows Future of War."

When attacks occur on critical domestic infrastructure, it is not a given that they carry a strategic intent to damage the U.S.; the attacks can also be a message to the attacker's own population that their country can strike the Americans in their homeland. For a geopolitically inferior country that seeks to be a threat and a challenger to the U.S., such as Iran or North Korea, the massive American reaction to a limited attack on critical infrastructure serves its purpose. The attacker has shown its domestic audience that it can shake the Americans, especially when U.S. authorities attribute the attack to Iranian hackers, making it easier to present as news for the Iranian audience. Cyberattacks become a low-risk way of picking a fight with the Americans without triggering escalation.

Numerous cyberattacks on critical American infrastructure may simply be a way to harass American society, with no justification other than giving hostile authoritarian senior leaders an outlet for their frustration and anger against the U.S.

Attackers seeking to maximize civilian hardship as a tool to bring down a targeted society have historically provoked the reverse reaction. The German bombing of civilian targets during the 1940s air campaign known as the Blitz only hardened British resistance against the Nazis. An attacker needs to take into consideration the potential fallout of a significant attack on critical infrastructure. The reactions to Pearl Harbor and 9/11 show that any adversary attacking the American homeland runs the risk that the attack unifies American society instead of injecting fear and forcing submission to foreign will.

Critical infrastructure is a significant attack vector to track and defend. Still, cyberattacks on U.S. critical infrastructure create massive reactions that are often predictable, and that predictability is itself a vulnerability if exploited by an adversary following the Soviet/Russian concept of reflexive control.

The War Game Revival


The sudden fall of Kabul, when the Afghan government imploded in a few days, shows how hard it is to predict and assess future developments. War games have had a revival in recent years as a way to understand potential geopolitical risks better. War games are tools that support our thinking and force us to accept that developments we did not anticipate can happen, but games also have a flip side. War games can act as afterburners for our confirmation bias and inward, self-confirming thinking. Would an Afghanistan-focused war game designed two years ago have included a governmental implosion within a few days as a potential outcome? Maybe not.

Awareness of how bias plays into the games is key to success. The wargaming revival occurs for good reason. Well-designed war games make us better thinkers; the games can be a cost-effective way to simulate a variety of outcomes, and you can go back and repeat the game with lessons learned.

War games are rules-driven; the rules create the mechanical underpinnings that decide outcomes, either success or failure. Rules are condensed assumptions, and therein resides a significant vulnerability. Are we designing games that operate only within the realm of our own aggregated bias?

We operate in large organizations that have modeled how things should work. The timely execution of missions is predictable according to doctrine. In reality, things do not play out the way we planned; we know it, but the question is, how do you quantify a variety of outcomes and codify them into rules?

Our war games and the lessons learned from them are never perfect. The games are intellectual exercises for thinking about how situations could unfold and for dealing with the results. In the interwar years, the U.S. rightly decided to focus on Japan as a potential adversary. Significant time and effort went into war planning based on studies and war games that simulated the potential Pacific fight. The U.S. assumed one major decisive battle between the U.S. Navy and the Imperial Japanese Navy, in which lines of battleships would fight it out at a distance. In the plans, that was the crescendo of the Pacific war. The plans missed the technical advances in, and the importance of, airpower, aircraft carriers, and submarines. Who set up the war games? Who created the rules? A cadre of officers who had served in the surface fleet and knew how large ships fought. There is naturally more to the story of interwar war planning, but as an example, this short account serves its purpose.

How do we avoid creating war games that only confirm our predispositions and lure us into believing that we are prepared, instead of presenting the war we will have to fight?

How do you incorporate all these uncertainties into a war game? Fully, it is impossible, but mitigating the biases at least to a degree ensures the games retain their value.

Studying historical battles can also give insights. In the 1980s, sizeable commercial war games featured massive maps, numerous die-cut unit counters, and hours of playtime. One of these games was SPI's "Wacht am Rhein," which simulated the Battle of the Bulge from start to finish. The game visualized one thing clearly: it does not matter how many units you can throw into battle if they are stuck in a traffic jam. Historical war games can teach us lessons that need to be kept in memory to avoid repeating the mistakes of the past.

Bias in war game design is hard to root out. The viable way forward is to challenge the assumptions and the rules. Outsiders do this better than insiders because they will see the "officially ignored" flaws. These outsiders must be knowledgeable enough to understand the game but have minimal ties to the outcome, so they are free to voice their opinions. There are experts out there. Commercial lawyers challenge assumptions for a living and are experts in asking questions; it can be worth a few billable hours to ask them to find the flaws. Colleagues are not well suited to challenge the "officially ignored" flaws because they are marinated in the very ideas that established those flaws. Academics dependent on DOD funding could gravitate toward accepting the "officially ignored" flaws; that is simply human behavior. The fewer ties to the initiator of the game, the better.

Another way to address uncertainty and bias is repeated games. In the first game, cyber has the effects we anticipate. In the second game, cyber has limited effect and turns out to be an operational dud. In the third game, cyber effects proliferate and have a more significant impact than we anticipated. I use these quick examples to show that there is value in repeated games. The repeated games become a journey of realization and afterthought due to the variety of factors and outcomes. Afterward, we can use logic and understanding to arrange the outcomes and better understand reality. Repeated games limit the range and impact of specific biases through the variety of conditions they impose.
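The intuition behind repetition can be sketched in a toy Monte Carlo form. Everything in the sketch below is hypothetical for illustration only: the scenario names, the effect multipliers, and the crude engagement model are assumptions I introduce, not rules from any real war game design. The point it demonstrates is narrow but real: varying one condensed assumption (the cyber-effect multiplier) across repeated runs reveals how much the outcome hinges on that single rule.

```python
import random

# Hypothetical cyber-effect assumptions, one per repeated game.
# The multiplier scales the attacker's effect in a toy engagement model.
ASSUMPTIONS = {
    "cyber as anticipated": 1.0,   # effects roughly as doctrine expects
    "cyber is a dud": 0.2,         # effects largely fail to materialize
    "cyber proliferates": 1.8,     # effects exceed expectations
}

def run_game(cyber_multiplier: float, rng: random.Random) -> bool:
    """One toy engagement: the attacker wins if its rolled strength,
    scaled by the assumed cyber effect, exceeds the defender's roll."""
    attacker = rng.uniform(0, 100) * cyber_multiplier
    defender = rng.uniform(0, 100)
    return attacker > defender

def repeated_games(trials: int = 10_000, seed: int = 1) -> dict:
    """Replay the same toy game under each assumption and
    report the attacker's win rate per assumption."""
    rng = random.Random(seed)
    return {
        name: sum(run_game(mult, rng) for _ in range(trials)) / trials
        for name, mult in ASSUMPTIONS.items()
    }

if __name__ == "__main__":
    for name, win_rate in repeated_games().items():
        print(f"{name}: attacker win rate {win_rate:.2f}")
```

The model itself is deliberately crude; the design choice worth noting is that the only thing changing between runs is one rule, which is exactly how repeated play isolates the bias hidden in a single condensed assumption.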

The revival of wargaming is needed because wargaming can be a low-cost, high-return intellectual endeavor. Hopefully, we can navigate away from the risks of groupthink and confirmation bias embedded in poor design. The intellectual journey that war games take us on will make our current and future decision-makers better equipped to understand an increasingly complex world.


Jan Kallberg, Ph.D.