PENTAGON: The Pentagon says it wants a revolution and the 2017 budget to be unveiled today funds a host of high-tech weapons, from arsenal planes to Hyper Velocity Projectiles to robots. But for Deputy Defense Secretary Bob Work, the real bleeding edge of innovation is not a weapon, no matter how impressive. It’s a secretive command post with a jawbreaking name in a repurposed building in Colorado Springs — the Joint Interagency Combined Space Operations Center (JICSPOC).
“The JICSPOC in our view is the first operational and organizational construct of the Third Offset Strategy,” Work told me in an exclusive 85-minute interview in his E-ring office. “People say, ‘what’s the Third Offset Strategy about?’ And they say, ‘oh, it’s about AI [artificial intelligence] and autonomy.’ We say no…. It’s about human-machine collaborative combat networks.”
In other words, Work does not want autonomous robotic warriors like the Terminator. He wants artificial intelligence first and foremost to help humans make decisions. He wants computers to keep an unblinking eye on potential enemies; to sort through gigabytes of big data for actionable intelligence; to see subtle patterns in the movement of troops and satellites; to counter incoming cyberattacks, jamming, and missiles that move too fast for human reflexes; to assemble routine target lists so the humans can concentrate on strategy.
Networks: Our Strength And Weakness
Today, by contrast, digital networks are both the backbone and Achilles’ heel of the US military. Everyone from foot troops to airplanes to smart bombs depends on GPS to know where they’re going. Commanders send orders wirelessly to both human subordinates and drones. Sensor feeds from multiple sources converge on command center screens. But this invisible architecture is dangerously fragile in the face of hacking, jamming, or someone simply broadcasting on the wrong frequency. When it does work, the system often drowns the human users in data. Our current computers help us get information about the world, but they don’t help us understand it.
“The whole vision of the offset is to make the human better, not to make the machines better,” Work emphasized. “We’re building on the [existing] battle networks that employ conventional weapons, and we’re vastly improving them by utilizing AI and autonomy…to allow humans to make better decisions, to perform better in combat, and to be more effective.”
“That’s what the Third Offset Strategy is all about,” he said.
Hence the crucial role of the nascent JICSPOC. “If there is a war on earth, we expect war to extend into space very early, that our [satellite] constellation will be under threat,” Work said. “Right now, we don’t have the command and control or operational and organizational constructs to allow us to fight that constellation.” That means the military and Intelligence Community need to realize the enemy is about to attack — with a missile, a virus, jamming — and then order satellites to move out of harm’s way, for example, or to assess the damage from a successful attack and reconfigure the constellation to provide as much coverage as possible with what remains. It may also mean ordering airborne or ground assets to help fill any gaps caused by the attack.
“The JICSPOC is the experimental platform that is trying to determine what are the command and control things that we need,” Work said: “the different types of space situational awareness, the deep learning machines that will allow us to determine what is happening in the constellation, the human-machine collaboration which is advanced visualization and battle network tools for the commander.”
The “learning machines” Work references are computers with a limited ability to learn from experience and adapt to new situations without humans having to reprogram them every time. Google’s driverless cars use such technology to navigate the infinite and unpredictable combinations of other vehicles, pedestrians, and road conditions they encounter.
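To make the concept concrete, here is a toy sketch — invented for illustration, not any actual military or Google system — of the core idea: a program that adjusts its own behavior from each new observation, with no human reprogramming between examples. The example uses a simple online perceptron, one of the oldest learning algorithms.

```python
# Minimal sketch of a "learning machine": an online perceptron that
# adapts to each new observation as it arrives, rather than being
# reprogrammed by a human. All data here is synthetic and illustrative.

def train_online(stream, lr=0.1):
    """Update the weights one example at a time, as data streams in."""
    w, b = [0.0, 0.0], 0.0
    for x, label in stream:  # label is +1 or -1
        pred = 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else -1
        if pred != label:  # learn only from mistakes
            w[0] += lr * label * x[0]
            w[1] += lr * label * x[1]
            b += lr * label
    return w, b

# Toy stream: points above the line y = x are labeled +1, below it -1.
stream = [((1, 2), 1), ((2, 1), -1), ((0, 3), 1), ((3, 0), -1),
          ((1, 3), 1), ((3, 1), -1)] * 20
w, b = train_online(stream)
classify = lambda x: 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else -1
print(classify((0, 5)), classify((5, 0)))  # a point from each side
```

The key property is in the loop: there is no separate "retraining" phase a programmer must run; every new example nudges the model.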
In fact, Work sees the rise of automation in the commercial auto industry as a good model for where he wants the military to go. At the most basic level, it’s sensors with the smarts to warn you when you’re doing something dangerous. “Human-machine collaboration is the lane departure warning that you get,” he said, “or the beep, beep, beep when you’re backing up and it’s saying you’re getting close to something.”
Now automotive automation is going further, he said, for example with self-parking cars. “We’ve already designed in the car autonomy to park,” he said. “We push the button and we say, ‘I believe.'”
Increasingly, Work continued, we’ll see cars that have collision avoidance sensors that will not only alert their driver to the danger, but will hit the brakes without human intervention. “You delegate to the car to make that decision” when events are moving too fast for you to react, Work said. For those few seconds, “you are out of the loop.”
“Out of the loop” is an unnerving place for humans to be when it comes to military robots. Work makes clear that computers won’t be shooting people of their own accord. “How you develop your combat network to make sure that the human is always in the loop is going to be very, very critical in the Third Offset,” he said. “This is a big, big, moral, legal, political decision. How much do you delegate to a machine?”
In at least three areas, Work says, events move so fast that humans should let computers take action on their own, within strict boundaries. “There’s going to be things like missile defense, electronic warfare, and cyber where you’re going to have to rely on the machines because you’re going to fight those fights at machine speeds,” he said. “These learning machines have the capability to recognize an attack is coming and to fight it off.”
Computers Making Decisions
That said, even if humans can’t respond fast enough to give direction in real time, they can set limits in advance, Work said: “The first thing you’ll tell the learning machine is, here are the parameters of your initial response. This is how much you can do on your own.”
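The logic Work describes — humans setting the bounds in advance, the machine acting at machine speed only within them — can be sketched in a few lines. The action names, thresholds, and threats below are hypothetical, invented purely to illustrate the pattern; they do not describe any real defense system.

```python
# Hypothetical sketch: humans pre-authorize a set of responses and a
# cap on autonomous action; anything outside those parameters is
# escalated to a human. All names and limits here are invented.

ALLOWED_ACTIONS = {"switch_frequency", "quarantine_malware"}  # pre-authorized
MAX_AUTONOMOUS_RESPONSES = 3  # after this many, a human must step in

def respond(threats, log):
    """React at machine speed, but only inside human-set parameters."""
    autonomous = 0
    for threat, proposed_action in threats:
        if proposed_action in ALLOWED_ACTIONS and autonomous < MAX_AUTONOMOUS_RESPONSES:
            log.append(f"auto: {proposed_action} vs {threat}")
            autonomous += 1
        else:
            log.append(f"escalate to human: {threat}")
    return log

log = respond([("jamming", "switch_frequency"),
               ("virus", "quarantine_malware"),
               ("incoming missile", "launch_interceptor")], [])
print(log)
```

The design choice is the point: the machine never invents a response; it only selects among responses a human approved before the fight began.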
Sometimes that autonomous reaction means simply switching to communications frequencies that aren’t being jammed, or erasing malware from a computer. Other times, it may well mean the computer pulls the trigger on certain kinds of ordnance, especially interceptor missiles — and perhaps one day lasers and rail guns — for missile defense.
That’s not actually new, Work noted. “Right now, as you know, we have an automatic setting on our Aegis Combat System” aboard Navy cruisers and destroyers, he said. “When you have a heavy raid coming into a carrier strike group, you will go automatic. You’ll let the machine decide which threats to go after, how many missiles to shoot, what to shoot at.”
In most areas, however, Work sees the human making the decisions with the computer’s help — what’s sometimes called the “centaur” model of human-machine collaboration.
“We use the F-35 [Joint Strike Fighter] as a good example of human-machine collaboration: That is a flying computer and sensor network that displays everything to the pilot on a helmet,” Work said. “I don’t consider it a fighter plane. I consider it a flying component of an air combat network. And the reason why we believe it is going to be so effective is because of the way it sucks in all the data and presents it to the pilot.”
Another example of such collaboration, Work said, is the Navy’s pair of new reconnaissance aircraft: the manned P-8 Poseidon and the unmanned MQ-4C Triton. The drone provides long-endurance, wide-area surveillance that can tell the P-8 where to look more closely — and, if need be, drop anti-submarine torpedoes. The Air Force’s future Long-Range Strike Bomber will be a manned-unmanned combination in itself, he added, with the manned bomber dropping drones and autonomous weapons.
Learning machines can also help with the unglamorous staff work that sometimes wins or loses wars. Consider Vladimir Putin’s takeover of Crimea with Russian commandos, operating in uniform but without insignia or official acknowledgment. By the time the Western powers figured out what was happening, it was over. “The ‘little green men’ problem is a big data problem,” Work said. “If you can hook into all the social media, if you can track cell phone calls, if you can watch television reports and you can crunch all that data,” he said, the learning machine might be able to find the pattern for you and warn you in time to react.
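The "big data problem" Work describes reduces, at its simplest, to anomaly detection: establish a baseline of normal activity, then flag what deviates sharply from it. Here is a toy sketch with invented numbers — region names and figures are purely illustrative.

```python
# Toy sketch of baseline-vs-anomaly detection: flag any region whose
# current activity jumps well above its historical norm. The regions
# and counts are invented for illustration.

from statistics import mean, stdev

def flag_anomalies(history, current, threshold=3.0):
    """Return regions whose activity exceeds baseline by > threshold sigmas."""
    flagged = []
    for region, counts in history.items():
        mu, sigma = mean(counts), stdev(counts)
        if current[region] > mu + threshold * sigma:
            flagged.append(region)
    return flagged

history = {"region_a": [100, 98, 103, 99, 101],   # steady activity
           "region_b": [50, 52, 49, 51, 48]}
current = {"region_a": 102, "region_b": 95}        # region_b spikes
print(flag_anomalies(history, current))
```

Real systems fuse many noisy feeds rather than one clean count, but the principle — machines watching the baseline so humans can watch the spike — is the same.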
Or consider the notoriously laborious Air Tasking Order process that allocates aircraft to targets. “A learning machine might be able to automate the ATO for the Air Force, and be able to do it faster, and allow the air combat commander to make quicker decisions,” Work said.
All this computer-aided understanding calls to mind the “Transformation” effort of the late Clinton and early Bush administrations, more formally called the Revolution in Military Affairs (RMA). Both Offset and Transformation try to ride the wave of Moore’s Law, the doubling of processing power every 18 months that has changed our lives — or at least our phones. Both Offset and Transformation wrestle with the combination of precision guided weapons, long-range surveillance, and communications networks to convey target information from sensors to shooters. But they come at this revolution from opposite directions.
Transformation arose from the happy shock of the First Gulf War, when US casualties were a few percent of pre-war predictions because the American combination of precision, surveillance, and networks flattened the Iraqis. In the heady years after the Soviet Union fell, the apostles of transformation sought to make the most of America’s unequalled advantage. Smart bombs were just a fraction of the total dropped in 1991, for example, but are omnipresent now. In 1991, Air Tasking Orders had to be saved to floppy disks and physically flown from headquarters on land to carriers in the Gulf, but today commanders teleconference with each other and watch drone feeds from around the world.
Offset arises from the unhappy realization that the Russian bear is back, China is rising, and they’re rapidly fielding the very combination of precision, surveillance, and networks that was once a US monopoly. Worse, they’re developing tactics and technologies, especially in cyberspace and the radio spectrum, specifically to baffle, blind, or destroy our networked war machine. If our adversaries are learning how to copy and counter our current advantages, we need to offset their growing power — hence the name — by finding new advantages.
America’s Enduring Advantage
There is a second difference: at least in Work’s version, the Offset Strategy puts humans at the center, not technology. (The importance of people and institutions was another nuance of the original Revolution in Military Affairs idea that largely fell by the wayside over time.)
“If you ever hear anybody say that the third offset is about technology, just tell them they’ve got to be crazy,” Work said. “The technology is available to every competitor out there. Most of it’s being driven in the commercial sector, unlike in the Cold War, and so everyone’s going to be able…to try to put these things together.”
America’s unique, enduring advantage, therefore, is not technological. It’s in our people, our institutions, and our culture. “We believe that the third offset builds on three big, big advantages that we now have,” Work told me.
“The first advantage is our level of jointness,” Work said. “In terms of the scale of the joint battle networks that we can assemble, there’s nobody in our class.” For all its interservice rivalries, the US military brings land, sea, air, and now space and cyber forces together better and on a large scale than anyone else in the world. That’s an advantage our military institutions have taken painful decades to develop and which even the smartest, best-funded adversaries can’t replicate overnight.
Then there’s the institutional advantage that comes not just from the Pentagon but from our industrial base: our “proven ability to develop campaign-level systems of systems,” Work said. “We’re pretty damn good at that.” What he’s talking about is the engineering and managerial expertise to put together the technological infrastructure — chiefly the networks — that link together all the individual weapons systems into a coherent whole. Again, this is something that China and Russia, with their notorious corruption, can’t easily replicate.
Underlying it all is our people. Said Work, “our hypothesis, Sydney, is our young man or woman growing up in an iWorld and a democracy who is a little irreverent of authority, is endlessly creative, unafraid to make mistakes, is going to be better than a young man or woman who grew up in the iWorld in an authoritarian regime where their initiative is not necessarily appreciated.”
It’s those young men and women who will ultimately have to make the Offset Strategy’s systems work, whether against China or Russia or someone altogether unexpected. It’s those humans who’ll have to make hard calls. Bob Work’s job is to make sure they have machines smart enough to help.