The video above is from the Army news channel “Live Fire” Benning Report on the Boston Dynamics “Big Dog” robot. Host Amy Gunn of the Fort Benning Public Affairs Office breathlessly tells us that the robotics revolution is here:
Since 9/11 the Army has effectively employed thousands of flying robots like the Raven [link] and the Predator [link] and ground-based robots like the Talon [link] and the PackBot [link].
These modern marvels provide our troops with more situational awareness, greater lethality and safety on the battlefield.
Now I’m as excited about robots as the next guy, but what are some of the downsides of the greater lethality of those modern marvels?
For that I went back to a January 22 Fresh Air interview with P. W. Singer. His book, Wired for War, explores the advance of robotics in warfare and the ethical dilemmas it raises.
Some excerpts from that interview…
On using them here at home:
[The] Homeland Security Department saw what they were doing with these Predator drones minding, for example, the border between Afghanistan and Pakistan, and said, hey, why don’t we use that here? Now, the original rationale for it was supposedly counterterrorism, but so far it’s mainly been used against a different kind of homeland security threat, as some people would put it, where it’s being used to track down illegal immigrants at border crossings, drug dealers and the like. One of the Predators has been involved in over 30 different drug busts, capturing, you know, several thousand pounds of marijuana, for example.
On the impact of war robots on human psychology:
[W]hat we found is that there’s actually a whole lot of human psychology to the impact of robots on war. And one, for example, is the experience of the soldiers who are truly at war but not physically at war. That is, when we say, go to war, we’ve got a new twist on that meaning. I term it cubicle warriors. That is, these are folks who are working in office cubicles or something like that, but they’re juggling the psychological consequences of being at war but at home at the same time.
There’s a great quote from a Predator pilot who I interviewed, and he said it this way: You’re going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants. And then you get in the car and you drive home, and within 20 minutes, you are sitting at the dinner table talking to your kids about their homework.
And that’s one of the things that coming out of this is that we’re actually finding that the drone pilots, because of this tough psychological juggling they’re having to do, the drone pilots actually have higher levels of PTSD – Post-Traumatic Stress Disorder – than those who are actually physically serving in the combat zone.
War porn:
[O]ne of the things we’re finding is the rise of – I call it YouTube War. That is, the Iraq War, because of all these systems, is the first one where you can watch but you don’t have to be there. And these machines see all. And we’re taking these clips and watching from afar, but we’re also emailing them around.
We found over 7,000 different clips of combat footage in Iraq, and the soldiers actually call them war porn. And the worry of it is that it connects people to war. They get to see what’s happening, but it actually widens the gaps, that is, it creates a further distance. They watch more but they experience less.
We’ll always want humans in the loop:
[T]he reality is, humans have already been moving out of the loop well before we got to robotics. We’ve just kept redefining the loop. That is, a system like the Aegis, which is the air defense system on U.S. Navy ships, it already has a series of modes in which the system can take over, for example, if the humans are killed. Even when the humans are involved, the system is working so fast the human’s power in it is one of veto power. That is, they can only shut it down. And when we look at incident after incident, they’re afraid to shut it down because they trust the machine’s judgment more than themselves.
This is, for example, what a B-52 navigator described of his experience of bombing Iraq. Quote, “The navigation computer opened the bomb doors and dropped the weapons into the dark.” That was what it felt like to him to drop a bomb. That’s where we’re already at right now without robotics moving into greater autonomy. But the next part of it is that there are all sorts of demands, whether we want to openly talk about them or not, that are leading us to build more and more autonomy into our systems. For example, if you always have just one human in control, you don’t get any personnel savings from it. And that’s why, for example, the Army’s doing research on how to have multiple robots, as many as 10 at a time, controlled by one human. Well, that means you’re giving more autonomy.
Then, of course, it’s war. The enemy gets a vote. So, they go, well what if the enemy cuts the line of communication? Well, we need the robot to be able to do certain parts of the mission on its own. So you know, another little slippery slope gets, you know – we go a little bit further down the slippery slope there.
Listen to the entire interview. Scary stuff. And, yes, very freaky.