'When are the metal ones coming for me?'
14th Sep 2009 | 11:05
Wired for War author PW Singer talks about robots in war
How warzone robots will evolve
Could robots really take over the world? Peter W Singer is an American political scientist and the author of Wired for War: The Robotics Revolution and Conflict in the 21st Century.
Now out in the UK, the book explores the massively increasing role of robots on battlefields.
The book explores the questions that unmanned systems present for everything from when wars begin and end to warriors' very experiences – and the potential threat from AI to humans.
"You can't write a book about robots and war without have to deal with the 'when are the metal ones coming for me?' question," says Singer, who was also coordinator of defence policy for Barrack Obama's Presidential campaign.
But isn't it tempting to think of military robotics as being something of the future rather than the present? "Exactly! That is the point of the book, to capture this immense revolution around us that we aren't much noticing. Look at the raw numbers. When US forces went into Iraq in 2003, they had zero robotic units on the ground. By the end of 2004, the number was up to 150. Today it is over 12,000."
With the increase in drone strikes into Pakistan, as well as the expanded purchasing and use of unmanned systems in the UK's new defence plan, Singer believes the book's subject matter is more timely than ever.
"Already in the prototype stage are varieties of unmanned weapons and exotic technologies, from automated machine guns and robotic doctors to tiny but lethal robots the size of insects, which often look like they are straight out of the wildest science fiction," explains Singer.
For his book, Singer interviewed hundreds of robot scientists, science fiction writers, soldiers, insurgents, politicians, lawyers, journalists, and human rights activists from around the world.
Singer did meet some people he refers to as "refuseniks," such as scientist Illah Nourbakhsh. "Illah is one of the most fascinating people I met on the journey. Refuseniks are robotics scientists who look at what happened to the nuclear physicists behind the Manhattan Project, who built the atomic bomb and then regretted it for the rest of their lives.
"The Refuseniks like Illah don't want the same thing to happen to them and instead want to push a debate about ethics and robotics, including the ethics of the scientists who build them. So, despite the huge amount of money being offered, he refuses to take Pentagon funding and instead builds robots that he believes will make the best possible contribution to society."
Moore's Law in robotics
So how will robotic technology develop? Singer believes the two trends to keep an eye on are Moore's Law and miniaturisation.
"The multiplying effect of Moore's Law, year after year, is the reason that refrigerator magnets which play Christmas jingles now have more computing power than the entire Royal Air Force did back in 1959.
"If Moore's Law holds true, then within 25 years, this doubling effect will have robots running on computers that are a billion times more powerful than those today.
"To be clear, I don't mean "billion" is the sort of amorphous way that people throw about the term, but literally multiplying the power of an iPhone or Predator drone by 1,000,000,000.
"Now some argue that Moore's law won't hold and it will slow down. That may be true. But let's say the pace of advancement only goes one percent as fast as it has for the last few decades. Then our robots will be guided by computers a mere 1,000,000 times more powerful than today," muses Singer.
And what about miniaturisation, how will that evolve? "I recall seeing at one Air Force lab a tiny rocket engine that fit on the tip of a pen. Imagine the capabilities that can provide in war! A commando will literally fly a 'fly on the wall' in from over 1,000 metres away.
"But what the book is about is also how we have to weigh the dilemmas that very same technology will bring. For instance, that the very same technology will be available to terrorists, corporations, criminals, and even my neighbours."
So Singer believes we have to better understand what is happening now if we want to handle it intelligently. "Robotics and AI are no longer just science fiction, but are becoming technological, as well as political, reality. Or as one US military-funded robotics researcher put it to me, many may want to 'think that the technology is so far in the future that we'll all be dead [and so don't have to talk about it]. But to think that way is to be brain dead now.'"
When should we salute our metal masters?
Let's not beat around the bush. The future vision described in the book is pretty frightening, isn't it? "Yes, it's pretty darn scary. But then again, the very first line of Wired for War is 'Because robots are cool.' That is my answer as to why someone writes a book about robots… [It's] also written in a way that isn't meant to scare, but to approach this important topic with a sense of both excitement and foreboding."
So how real does Singer think the threat of AI with evil intent is, given the reactions of those he spoke to for the book? "Perhaps you should rephrase the question as, 'So when should we salute our metal masters?'" he jokes.
"Look, you can't write a book about robots and war without have to deal with the 'When are the metal ones coming for me?' question. Essentially, four conditions would have to be met. First, the machines would have to have some sort of survival instinct or will to power.
"In the Terminator movies, for instance, Skynet decides to launch a nuclear holocaust against humans in a bizarre form of self-defence after their scared attempts to take it offline when it reaches sentience.
"Second, the machines would have to be more intelligent than humans, but have no positive human qualities (such as empathy or ethics)," he continues.
"The third is that the machines would have to be independent, able to fuel, repair, and reproduce themselves without human help. And, fourth, humans would have to have no useful fail-safes or control interface into the machines' decision-making. We would have to have lost any ability to override, intervene, or even shape the machines' actions."
Could it happen?
Singer believes these conditions would present too high a barrier in the short term. "For example, most of the focus in military robotics today is to use technology as a substitute for human risk and loss," Singer explains. "It is the very opposite goal of giving them any survival instinct.
"Second, the ability of machines to reach human level intelligence may be likely someday, even sooner than most expect given the rapid doubling effect of Moore's Law on our technology just under every two years. But it is not certain.
"Third, while our real-world robotics are becoming incredibly capable, they all still require humans to run, support, and power them." Singer reaches for an example – the Global Hawk drone, the replacement for the manned U-2 spyplane.
"It can take off on its own from New York, fly 3,000 miles on its own to London, stay in the air 24 hours, using its surveillance and intelligence gathering systems to hunt for a terrorist over the entire city, then fly back 3,000 miles on its own to New York, and land on its own. But, the drone still needs humans on the ground to gas and repair it.
"Fourth, there are enough people spun up about the fears of a robot takeover that the idea that no one would try to build in any fail-safes is a bit of a stretch. Most importantly, perhaps, the whole idea of a machine takeover rests on a massive assumption.
"As many roboticists joke, just when the robots are poised to take over humanity, their Microsoft software programs will likely freeze up and crash.
"The counter to all of this, of course, is that eventually a super-intelligent machine would figure out a way around each of these barriers. In the Terminator storyline, for example, the Skynet computer is able to trick, manipulate, or blackmail humans into doing the sorts of things it needed (for example, emailing false commands to military units or putting humans in concentration camps), as well as rewrite its own software (something happening today with evolutionary software)."
Singer says the idea that we would learn our lessons from science fiction is somewhat undermined by the fact that real-world military expediency has us carrying out research into all sorts of systems that science fiction directly warns us about.
"This is nothing new. HG Wells' warning of what he called an 'atomic bomb' in the anti-war story World Set Free instead served as the inspiration for the Manhattan Project. As I talk about in my book, one robotics firm was actually asked a few years ago by the military if they could design a robot that looked like the 'Hunter-Killer robot of Terminator.'
"It wasn't such a silly request. The design would actually be quite useful for the sort of fights we face now in Iraq and Afghanistan," Singer concludes.
Buy Wired for War from Amazon UK.
Liked this? Then check out an extract: The refuseniks – roboticists who just say no