As an industrial nation transitioning to an information society engaged in digital conflict, we tend to see the technology as the weapon. In the process, we ignore the fact that a few humans can have a large-scale operational impact.
But we underestimate the importance of applicable intelligence: the intelligence of knowing how to apply things in the right order. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are mostly publicly available; anyone can download them from the Internet and use them. But the weaponization of the tools occurs when they are wielded by someone who understands how to use them in the right order.
In 2017, Gen. Paul Nakasone said “our best [coders] are 50 or 100 times better than their peers,” and asked “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.” The success of cyber operations depends not on tools, but on the super-empowered individual that Nakasone calls “the 50-x coder.”
There have always been exceptional individuals with an irreplaceable ability to see the challenge early on, create a technical solution, and know how to play it for maximum impact. They are out there: the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of artificial intelligence increases the reliance on these highly capable individuals, because someone must set the rules and chart the trajectory for artificial intelligence at the outset.
But this also raises a series of questions. Even if identified as a weapon, how do you make a human mind “classified”? How do we protect these high-ability individuals who are weapons in the digital world?
These minds are different because they see an opportunity to exploit in the digital fog of war when others do not. They address problems in innovative ways, unburdened by traditional thinking; they maximize the dual-use potential of digital tools; and they can generate decisive cyber effects.
It is this applicable intelligence that creates the process, understands the application of tools, and turns simple digital software into lethal digital weapons. In the analog world, it is as if these individuals had the supernatural ability to build a hypersonic missile from materials readily available at Kroger or Albertsons. These individuals are strategic national security assets.
Systemically, we struggle to see humans as the weapon, maybe because we like to see weapons as something tangible, painted black, tan, or green, that can be stored and brought to action when needed.
For America, technological wonders are a sign of prosperity, ability, self-determination, and advancement, a story that runs from the early days of the colonies through the Erie Canal, the manufacturing era, and the moon landing, all the way to today's autonomous systems, drones, and robots. In this default mindset, there is always a tool, an automated process, a piece of software, or a set of technical steps that can solve a problem or take action. The same mindset sees humans merely as inputs to technology, and therefore as interchangeable and replaceable.
Super-empowered individuals are not interchangeable and cannot be replaced, unless we want to be stuck in a digital war. Artificial intelligence and machine learning support the intellectual endeavor of cyber defending America, but humans set the strategy and direction.
It is time to see weaponized minds for what they are: not dudes and dudettes, but strike capabilities.
Jan Kallberg, Ph.D., LL.M., is a research scientist at the Army Cyber Institute at West Point and an assistant professor in the department of social sciences at the United States Military Academy. The views expressed are those of the author and do not reflect the official policy or position of the Army Cyber Institute at West Point, the United States Military Academy, or the Department of Defense.