His most recent position was Special Assistant to the Under Secretary of Defense for Policy. This is an issue that brings together a whole array of disciplines: technology, military operations, law, ethics, and other things. As children get older, we grant them more autonomy — to stay out later, to drive a car, to go off to college. And I think that was right. Russia and China obviously come up a lot in the book, but experts seem to be more worried about non-state actors. Because no one involved in this debate, even those arguing that autonomous weapons will definitely save lives, thinks there are no risks involved.
One thing I think your book does really well is help define the terms of this debate, distinguishing between different types of autonomy. The big difference this time around is the lack of direct humanitarian threat. Is that a good thing? But collectively, that makes war less controllable and is overall to the detriment of humanity. But I think attacks like that will scale up in sophistication and size over time because the technology is so widely available. But autonomy and intelligence are not the same thing.
The Air Force Flight Plan says that in a situation where computers can make decisions faster than humans, it might be advantageous to hand over control to machines. And so this is a place where having a robust discussion is helpful and much needed. No, they expanded their firepower, and in doing so, they took violence to a new level. People were being killed and maimed by landmines and cluster munitions, while here, the threat is very theoretical. What I think is important is establishing the underlying principles for what control of autonomous weapons looks like.
That resulted in the treaties banning landmines and cluster munitions, for example. Paul Scharre is a Senior Fellow and Director of the Technology and National Security Program at the Center for a New American Security. I felt like I had enough to say that I wanted to write a book about it. In the case of the United States, they sort of stumbled into this military robotic revolution through Iraq and Afghanistan. The issue is certainly heating up, particularly as we see autonomous technologies develop in other spaces, like self-driving cars.
Did they reduce the number of people in their armies? Many technologies certainly look great when you are the one who has them. This interview has been condensed and lightly edited for clarity. I think of it as posing the question: if we had any and all technology we could think of, what role would we want humans to play in war? More importantly, he pays as much attention to the political dimension of autonomous weapons as to the underlying technology, looking at things like historical attempts at arms control. Of course, this turned out not to be the case. He thought that a gun that fired automatically would mean fewer soldiers on the battlefield and therefore fewer deaths. People see a car with autonomy, and they make the connection between that and weapons.
I put down 10,000 words in the book talking about this problem, and now I have to sum it up in a paragraph or two. I think that more conversations about the topic by academics in the public sphere are all for the good. Our readers are familiar with the development of tech like self-driving vehicles by private companies, but how and when did the Army get interested in this? Ultimately, the challenge is not really autonomy or technology itself, but ourselves. With that in mind, are there any particular concepts here that you think are regularly misunderstood? There are only a few fully autonomous weapon systems deployed around the world, including the Aegis combat system and the Israeli Harpy drone. Do you think it is inevitable that new technologies in warfare will have these unintended, bloody consequences? I think that is one of the central questions of the book.
Scharre led the DoD working group that drafted DoD Directive 3000.09. Scharre is a term member of the Council on Foreign Relations. Scharre was involved in the drafting of policy guidance in the 2012 Defense Strategic Guidance, 2010 Quadrennial Defense Review, and Secretary-level planning guidance. Do you think such a treaty is likely to happen? What was your motivation for writing it? Given this fact — that autonomous weapons systems are being built in response to autonomous weapon systems — do you think the forward march of this technology is unstoppable? When tracing the history of autonomous weapons, you start with the American Civil War and the inventor of the Gatling gun, Richard Gatling. And automation there did reduce the number of people needed to deliver a certain amount of firepower: four people with a Gatling gun could deliver as much firepower as 100 people. With the Gatling gun, it was one of those fascinating things I stumbled across while researching the history of this field.
The doc says we can envision this period of time where the speed advantages make it best to go to full autonomy, and this raises all these tricky ethical and legal questions, and we need to start talking about it. I like that, and I want to see more of that conversation internationally. What decisions require uniquely human judgment? This seems incredibly important because how can we discuss these issues without a common language? As a former Army Ranger and someone who has helped write government policy on autonomous weapons, Scharre is knowledgeable and concise. They point out that a lot of this technology, like autonomous navigation and small drones, is freely available. You need to talk about autonomy in what respect: what task are you talking about automating? But the question is, what did militaries do with that? There are lots of reasons to think not.