Should US Unleash War Robots? Frank Kendall Vs. Bob Work, Army

Posted by Sydney J. Freedberg Jr.

Army photo

The man in the loop — a soldier mans a robot-carried machine gun at the Army’s PACMAN-I experiment.

WILLIAMSBURG, Va.: The Pentagon’s top weapons buyer, Frank Kendall, warned today that the US might hobble itself in future warfare by insisting on human control of thinking weapons if our adversaries just let their robots pull the trigger. Kendall even worries that Deputy Defense Secretary Bob Work is being too optimistic when Work says humans and machines working together will beat robots without oversight.

These are unnerving ideas — and top Army leaders swiftly responded with concern that robots would shoot civilians if you take the human out of the loop. This is what Vice Chairman of the Joint Chiefs Paul Selva calls the Terminator Conundrum: “When do we want to cross that line as humans? And who wants to cross it first? Those are really hard ethical questions.” They are also a fundamental question of combat effectiveness.

“Even in a more conventional conflict, we’re quite careful about not killing innocent civilians,” said Kendall. “I don’t expect our adversaries to all behave that way, and the advantage you have if you don’t worry about that as much, is you make decisions more quickly” — which can spell the difference between victory and defeat.

Air Force Photo

Frank Kendall visiting Bagram airbase in Afghanistan.

Kendall recounted an Israeli experiment he recently saw, in which a tank rigged to detect incoming fire automatically turned its turret and aimed its gun at the target. But then it waited for the human to pull the trigger. “It would take nothing to automate firing back, nothing,” Kendall told the Army Innovation Summit here, noting Israeli Defense Force experiments with autonomy. “Others are going to do it. They are not going to be as constrained as we are, and we’re going to have a fundamental disadvantage if we don’t.”

“Bob Work’s view, for the near future at least, (is that) humans with machines will make better decisions than machines (will) alone,” Kendall continued. “He may be right about that; there are instances where that is true. I don’t know how much he’s right about or how long it will be true.”

As computers keep getting better, Kendall said, “the trend is certainly against us” — “us” as in “human beings.”

While Work is right to emphasize the importance of autonomy in the Third Offset Strategy to retain America’s high-tech advantage, Kendall is “not as sure that he’s right” that human-machine teaming is superior to machines untethered to human judgment.

Because software proliferates easily around the world, Kendall said, “what I’m afraid of is that other people will have comparable autonomy and they’ll be much more ruthless about how they employ it than we are, and I think that might be a fundamental disadvantage.”

Robert Work

This isn’t a theoretical question; it’s one we must answer now. “Our policy is there has to be meaningful human control over lethal systems,” said Kendall. “Defining that is not easy. If you take a smart seeker, which you send out on a missile to look at a few hundred square meters, for example, to pick out the armored vehicle (and) then autonomously attack that vehicle, is that meaningful human control or not?”

Do you trust the computer to tell friend from foe from bystander, knowing that pattern-recognition algorithms not only make mistakes, but can make mistakes no human ever would? Kendall cited DARPA director Arati Prabhakar’s favorite example, of a cutting-edge imagery analysis program that identified a baby holding a toothbrush as “young boy…holding a baseball bat.” The problem is less funny if the software confuses a child with a stick for a terrorist with a gun.

The Army Pushes Back

Three top Army leaders — the No. 2 civilian and two four-star generals — pushed back politely after Kendall spoke. “The fundamental character of the American soldier is that we protect people and we don’t take collateral damage lightly,” Army Undersecretary Patrick Murphy, an Iraq veteran himself, told reporters. “We understand our competitors and the terrorists out there don’t have the same core value set, but that’s not something we’re going to compromise.”

Army photo

Patrick Murphy

“We still have the rules of war,” added Army Materiel Command chief Gen. Dennis Via. “It still involves the human dimension.”

We shouldn’t just throw out our current checks and balances on collateral damage, Gen. David Perkins, head of Training & Doctrine Command, said. “Autonomy is not something completely new: If you have an artillery piece…once that round has left the tube, it’s autonomous. (If) all of a sudden, a school bus drives in front of it… I can’t do anything about it,” Perkins said. Even with future weapons that had “an order of magnitude” more autonomy, he said, “we would still go through all the same processes.”

Admittedly, that’s “a lengthy process,” Perkins said. But it should be possible to translate it for the robotic age. In essence, highly trained and highly ethical humans need to look at the war zone, look at the best intelligence on where civilians, combatants, and friendlies are likely to be, consider how powerful different weapons are and how likely they are to miss, and then come up with rules of engagement.

Rules of Engagement are essentially sets of if-then criteria, much like a computer algorithm. If we can train human soldiers when to fire and when to hold back, perhaps we can program a computer to do the same. Algorithms may make mistakes no human ever would — the “boy with baseball bat” problem — but their judgment will never be clouded by rage, terror, or sleep deprivation, so they may avoid mistakes humans make all the time.
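To see what “rules of engagement as if-then criteria” might look like in practice, here is a minimal, purely hypothetical sketch in Python. The classifications, confidence threshold, and rules are invented for illustration; they do not describe any real targeting system or the Pentagon’s actual policy.

```python
# A toy sketch of rules of engagement expressed as if-then criteria.
# Every class name, threshold, and rule here is a hypothetical
# illustration, not a real weapon system's logic.

from dataclasses import dataclass


@dataclass
class Contact:
    classification: str    # e.g. "armored_vehicle", "civilian", "unknown"
    confidence: float      # classifier confidence, 0.0 to 1.0
    in_engagement_zone: bool


def engagement_decision(contact: Contact) -> str:
    """Return 'hold', 'refer_to_human', or 'engage' for a detected contact."""
    # Rule 1: never engage anything classified as a civilian.
    if contact.classification == "civilian":
        return "hold"
    # Rule 2: anything uncertain goes to a human -- the "meaningful
    # human control" step Kendall describes, and the guard against
    # the "boy with baseball bat" class of classifier error.
    if contact.classification == "unknown" or contact.confidence < 0.95:
        return "refer_to_human"
    # Rule 3: only a high-confidence hostile inside the approved
    # engagement zone may be engaged.
    if contact.classification == "armored_vehicle" and contact.in_engagement_zone:
        return "engage"
    return "hold"
```

The structure mirrors Perkins’s point: the hard work is not the code but the upstream human process that sets the classifications, thresholds, and zones the rules consume.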

UPDATE Pentagon spokesman Mark Wright hastened to emphasize Kendall had no desire to cry havoc and let slip the robot dogs of war. “Under Secretary Kendall never suggested that the department should or would change our policies or values,” Wright told me in an email. “He discussed the extreme care that the US takes to minimize civilian casualties, and was speaking to the fact that this might work to our disadvantage as our adversaries in the future could use automation to increase the lethality of their weaponry without the same regard to safeguarding innocent lives that we have. This is certainly something we should be aware of as we move forward.” UPDATE ENDS

Luc Dunn (AUSA)

Gen. David Perkins

Indeed, there is tremendous potential in the robot revolution to save lives as well as take them. Kendall doesn’t want us to pass that potential by. “We still send human beings carrying rifles down trails to find the enemy. We still do that. Why? Why?” Kendall asked. “We still do movements to contact… with armored vehicles which are increasingly vulnerable.”

“I don’t think we have to do that anymore, but it is an enormous change of mindset” to embrace the possibilities, Kendall said. “I don’t think you can stop this. Autonomy is coming, it’s coming at an exponential rate.”

What do you think?