Josh Marcuse, executive director of the Defense Innovation Board (DIB), said today at Defense One’s Tech Summit that both Silicon Valley engineers and members of the national defense community are stuck in a line of thinking about artificial intelligence (AI) and warfighting innovation that could undermine safety and increase risk on the battlefield.
Marcuse said Silicon Valley engineers need to embrace collaboration with the Defense Department (DoD), rather than shying away from assisting in the development of military-purposed AI. His comments came in response to former Deputy Secretary of Defense Robert Work’s assertion that Google’s decision to pull out of DoD’s Project Maven could further endanger military personnel and civilians.
“I believe the engineers who work at these companies who are taking issue with our approach should be active participants in making sure that ethics and safety is at the forefront of what we do, as it has been with every other weapon system,” Marcuse said.
He said that Americans have applied humanitarian law to the development of military technology in engagements that predate cyber conflicts and will continue to do so. He also said that Silicon Valley’s help is especially necessary at a time when adversaries do not respect intrinsic human values in the way the United States and its allies do.
“We, not just America, but joined with our allies, with democracies, who respect human rights, who respect privacy, who respect civil liberty, we are going to have to defend these democracies against adversaries or competitors who see the world very differently,” Marcuse said. “I think that we really need to get beyond looking at this question of ‘How will this one weapon system be used?’ and ask ourselves, ‘How are we that value capitalism, democracy, and civil liberty going to use artificial intelligence to compete with those who do not value those things?’”
The DIB, which Marcuse directs, provides the Secretary of Defense with independent counsel on technological innovation. Its members include famed astrophysicist Neil deGrasse Tyson and Chairman Eric Schmidt, former executive chairman of Google.
Marcuse said innovators like those at Google who care about safety ought likewise to care about how DoD works to preserve it.
“We want to be safe practitioners of these tools. And that requires working with us, and we want to work with these companies and these engineers,” he said.
Beyond private sector collaborators, Marcuse was also quick to call out the innovation culture within the defense community and noted that advances in AI by cyber warfare rivals could leave military personnel ill-equipped.
“They will use AI, and they will fight at AI speed, and I don’t want to show up with a dumb weapon on a smart battlefield,” he said.
Marcuse suggested that DoD ought to better prioritize two things, speed and risk, to respond to the changing nature of warfare.
“We’re seeing product release at a rate in the commercial world that is multiple [times faster than] the amount of time it takes the Department of Defense to write a requirement,” he said of the speed of innovation adoption in DoD. “If we don’t figure out how to get faster, in not only hypersonic weapons, but faster in decision speed, we’re never going to get to innovation, and we’re never going to get on the same rhythm and tempo as our Silicon Valley counterparts.”
He added that DoD research has actually “given the world much of the foundational technology that all of Silicon Valley rests on,” and that innovation itself isn’t the problem. “What we have today that we used to not have in the past is an innovation adoption problem,” he said.
Regarding risk, Marcuse said that DoD decision-makers in Washington are abdicating responsibility and shifting the repercussions of their decisions to the battlefield.
“We are not reducing risk, we are just transferring risk. We’re displacing risk,” he said. “There is a huge risk of inaction. There is a huge risk to the warfighter of going into battle without the right technology. And what we’ve created is a way of shipping risk from the cubicle to the foxhole.”
Marcuse argued that fully considering the ramifications of inaction would change how DoD decides on and ultimately deploys technology.
“We are making it easier for people here in Washington to protect themselves from being asked a tough question on the Hill or having their budget cut,” he said. “But what are we asking men and women to do in harm’s way? We’re asking them to do something much harder. So, if we viewed risk correctly in terms of the risk we put on people by not giving them the technology they need, that would completely change the way we make decisions.”