Pentagon may win race over self-driving vehicles

Chris Walker / Chicago Tribune

Forget Uber, Waymo and Tesla: The next big name in self-driving vehicles could be the Pentagon.

“We’re going to have self-driving vehicles in theater for the Army before we’ll have self-driving cars on the streets,” Michael Griffin, the undersecretary of defense for research and engineering, told lawmakers this month at a hearing on Capitol Hill. “But the core technologies will be the same.”

The stakes for the military are high. Fifty-two percent of casualties in combat zones can be attributed to military personnel delivering food, fuel and other logistics, Griffin said. Removing people from that equation with systems run on artificial intelligence could reduce injuries and deaths significantly.

“You’re in a very vulnerable position when you’re doing that kind of activity,” Griffin said. “If that can be done by an automated unmanned vehicle with a relatively simple AI driving algorithm where I don’t have to worry about pedestrians and road signs and all of that, why wouldn’t I do that?”

The Pentagon has a long history of supporting the development and refinement of key technologies that later became widespread, including space flight and the internet.

With an annual budget of almost $700 billion, the Pentagon can afford to aggressively pursue autonomous vehicle technology well beyond fuel and food delivery trucks. The Army, for instance, is pushing forward with efforts to develop unmanned tanks and smarter vehicles for bomb disarmament, though many of those technologies will be remote-controlled, not autonomous.

Maj. Alan L. Stephens, an officer at the Mounted Requirements Division of the U.S. Army Maneuver Center of Excellence in Georgia, said in December that the Army wants to start testing light, fast remote-controlled tanks with the same firepower as the current 70-plus-ton manned M1 Abrams tank within the next five years.

Among critics’ concerns is the potential development of autonomous weapons that make their own life-and-death targeting decisions. Ash Carter, who was defense secretary under President Barack Obama, told a Silicon Valley audience in 2016 that “in the matter of the use of lethal force, there will always be — at least speaking for the United States — a human being involved in decision-making.”

Charles Dunlap, a retired Air Force major general who’s now a law professor at Duke University, said companies will come under “enormous pressure” as they sort out these issues and try to make sure their artificial intelligence products for the military don’t put people in danger.

“Self-driving vehicles for battlefield logistics resupply are obviously more benign than autonomous weaponry,” Dunlap said in an email. “But it would still be legally and ethically necessary to demonstrate that they can be used without excessive risk to civilians who may be caught up in the fighting.”