The Robotics Revolution and Conflict in the 21st Century. Featuring speaker, Peter W. Singer.
Wired for War
THE ASPEN INSTITUTE
ASPEN IDEAS FESTIVAL 2012
WIRED FOR WAR:
THE ROBOTICS REVOLUTION AND CONFLICT IN THE 21ST CENTURY
Monday, July 2, 2012
LIST OF PARTICIPANTS
PETER W. SINGER
Director of the 21st Century Defense Initiative
Senior fellow in foreign policy at
The Brookings Institution.
* * * * *

P R O C E E D I N G S
MR. SINGER: (In progress) -- particularly at the Aspen Ideas Festival, it has to be the only place in the world where you can be standing in line with Barbra Streisand for cupcakes and end up talking about cyber security. So it's real neat.
So what I would like to do is talk to you about robotics in war, and for this I would like to pull back and tell you a little story. About a year-and-a-half ago, the robotics trade group, the Association for Unmanned Vehicle Systems International, held a meeting. Now this group, the robotics trade group, had come a long way. It was formed in 1972 by a couple of Air Force officers and defense contractors in Dayton, Ohio, who'd become enamored of the idea of flying robotic planes: drones.
They held their first meeting in a small conference room in a motel. Now since then, since 1972, that robotics trade group has grown to over 1,500 member companies in 55 different countries. And its conferences now fill entire convention centers not small rooms at the Holiday Inn.
In fact, if you are not doing anything in August and you want to go to a less hospitable place in summertime, you can visit their national convention; it will be held at the Mandalay Bay Casino in Vegas. Now, it was because of this headlong growth that the group was engaged in a little bit of institutional soul searching.
They were trying to figure out what exactly is the story that they should be telling their own members about what they do and what is the story they should be telling the rest of the world about what they do.
And they took this idea of narrative so seriously that the moderator for the conference wasn't a CEO, wasn't a scientist, wasn't a journalist like we have at the Ideas Festival here; they actually hired a professional storyteller. That is, someone who could help them pull together decades of technological and political developments into a single coherent narrative.
As one of them put it, what we have to figure out is, "Where have we come from, where are we now, and where should we and where do we want to go next?" And I thought those questions were a really neat way of focusing in on what is going on with robotics in war today.
And so what I would like to do is walk you through that notion of story and what I think are some of the answers to those questions.
Now, I am not one for PowerPoint and I have actually banned it from our own think tank but we have got several folks in here from Washington D.C. and I can see that they are going through withdrawal symptoms by not having enough PowerPoint.
So what we are going to do is actually a different kind of PowerPoint. It's not going to be something that I speak to; it's actually just going to play a series of pictures and videos of robots in action in places like Afghanistan today, or robots already at the prototype stage. It's for a couple of reasons. One is to give you a visual part of the story; every good story has visuals. The other part of it is frankly to prove that what we are talking about here isn't science fiction. This is all real. And then finally it's just to force you, when you get tired of looking at me, not to look down at your iPhone and type away but actually to look up, or you are going to miss something. But again, I am not going to talk to it. It's just a series of pictures and images.
Okay, let's pull back on all of this. The book that I did, called Wired for War, was about how there is something big going on in the story, the history, of both technology and war, but maybe even humanity itself. The U.S. military that went into Iraq in 2003 had a handful of drones; unmanned aerial systems, remotely piloted aircraft, whatever you want to call them; we had a handful in 2003, none of them armed.
We now have over 7,500 in the U.S. military inventory. The invasion force that went on the ground had zero unmanned ground vehicles. We now have more than 12,000 in the U.S. military inventory. Last year, the U.S. Air Force trained more unmanned systems operators than it trained manned fighter plane pilots and manned bomber plane pilots put together.
So there is something big going on here. But one of the things that we need to remember, when we are talking about these PackBots, or when we are talking about the Predator drone that you may have heard of, is that we are actually talking about the Model T Ford, we are talking about the Wright brothers' Flyer when it comes to these technologies. We are at the horseless carriage stage of all this. Even in how we wrap our heads around what we call it, horseless carriage, unmanned systems, we can only think of what they are not rather than what they are and what they are becoming. Now, it's important, though, that that's where we are at right now. Peering forward, I remember speaking with a U.S. Air Force three-star general, and he said how it won't be thousands of robots in our next conflict but, "tens of thousands of robots."
And it is not going to be tens of thousands of these robots, or even the prototype ones that you will soon see, because there is a rule in technology: it doesn't stop, it's always advancing.
We have encapsulated this by something that we call Moore's Law, which a lot of you may be familiar with: the idea that we have been able to pack more and more computing power into our microchips, into our computers, into our robots, such that they basically double in their power and capacity just about every 18 months. Now, I could give you no better illustration of Moore's Law and its impact on the military.
How many of you have ever received from, say, your kids or your spouse one of those Hallmark greeting cards that opens up and plays a little song? Just raise your hand. Okay. Everybody. If you've ever held one of those cards in your hands, you held more computing power than the entire U.S. Army had when my father served in it; in that one single card. And of course, technology doesn't stop today; it continues to advance.
So if Moore's Law holds true over the next 25 years, which is the range that we are supposed to be doing our strategic planning for, the way it's held true for the last 40 years, our technology, our computers, our robots would be roughly a billion times more powerful than today. And I don't mean a billion in kind of the amorphous way we talk about it back in Washington, you know, big deal budgets, a billion dollars here, a billion dollars there. I mean, literally multiply their current power by a one with nine zeros behind it.
Now, Moore's Law is not a law of physics; it doesn't necessarily have to hold true. What if, for example, technology moves at a pace that's just one one-thousandth of what it has historically? That means our technology would be a mere million times more powerful than today.
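The compounding arithmetic behind figures like these can be sketched in a few lines. This is purely an illustration under the standard assumption of a fixed 18-month doubling period, not anything from the talk itself; a billion-fold gain corresponds to about 30 doublings, since 2^30 is roughly 10^9.

```python
import math

def multiple_after(years, doubling_period_years=1.5):
    """Power multiple after a planning horizon, assuming one doubling
    per fixed period (the classic Moore's Law framing)."""
    return 2 ** (years / doubling_period_years)

def doublings_for(target_multiple):
    """How many doublings it takes to reach a target multiple."""
    return math.log2(target_multiple)

# A billion-fold gain takes about 30 doublings, since 2**30 ~ 1.07e9.
print(round(doublings_for(1e9)))  # 30

# One one-thousandth of a billion-fold gain is still a million-fold,
# mirroring the "mere million times more powerful" point above.
print(1e9 / 1_000)  # 1000000.0
```

The takeaway is just how insensitive the conclusion is to the exact pace: even slashing the multiplier by three orders of magnitude leaves a staggering gain.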
The point that I am making here is that the kind of things we used to talk about only at science fiction conventions like Comic-Con are things we are having to wrestle with in places like the Pentagon or in places like Aspen. What is it like to live, work and fight through a robotics revolution?
Now, I need to be very, very clear here when I say robotics revolution. I am not saying that the robots are going to revolt. I am not saying that you have to worry about the ex-Governator coming to your door or the Robopocalypse, which is a movie that Steven Spielberg is working on right now. I am just saying this: there are technologies in history that come along and they are game changers. These are technologies like the printing press, the steam engine, the atomic bomb. And the important thing, what makes them game changers, is not just the capabilities they offer you but the important questions they force you to ask that you didn't imagine you would be asking a generation earlier.
And these questions are not just questions of what's possible that you didn't imagine was possible a generation earlier, but they are also questions of what's proper, issues of right and wrong that you didn't imagine you would be asking yourself a generation earlier.
The historic comparisons that people make to where we stand now in the robotics revolution, I think, illustrate this point. When I went around interviewing people, I would get answers of where they thought we were comparatively. The scientists and the engineers tend to think that we are around 1910-1911, the horseless carriage stage. In 1910-1911, Ford Motor Company was selling just over 200 Model T Fords a year. Within 10 years, spurred on by the demands of war and changes in the economy, it's selling over a million of them a year.
But why is the horseless carriage important in history? Is it because most of us now have garages rather than stables? It's the bigger impacts that come out of that, on everything from society to commerce to war. It's things like, for example, the impact that they've had on our architecture, our cities; there was no such thing as suburbia before the car. The impact on issues of social relationships; teenagers could only court on their parents' front porch, and the automobile gave them a new sense of freedom: they could go and date. It's geopolitical impacts, geo-economic impacts.
There was a group of desert nomads at the time who turned out to be lucky enough to live over what was considered a nuisance rather than a resource, this black sticky stuff that would come up from the earth, and that of course helped move them into a very powerful geo-economic position. And of course that same technology is changing our planet itself; global climate change.
And the point here is also that this new technology creates new questions, new challenges. So for example, before horseless carriages you had no such thing as traffic laws. And when they came along, they said, well, we need something new. And the very first traffic laws entailed that someone was supposed to walk in front of a horseless carriage with a flag to let people know that it was coming. And when they got to an intersection, they were supposed to fire a flare into the sky to warn people they were about to turn. That made perfect sense in a world that moved only 4 or 5 miles per hour. It didn't make sense in a world revolutionized by horseless carriages.
Now, other people make different comparisons. Bill Gates, the founder of Microsoft, naturally makes a comparison to computers. And he says where robotics now stands is roughly equivalent to where the computer was in 1980. And he even says that if he were a young man right now, he would go into robotics rather than computers.
And think about that comparison. Back in 1980 a computer was a big, bulky device that could only do a limited set of functions. The military was the main spender on the research of computers; the military was one of the dominant clients in the marketplace of computers. But very soon computers get smaller. But we also figure out new functions for them. And they proliferate to such an extent that we stop calling them computers anymore.
So, for example, I drove here in a car that has over a hundred computers in it. We don't call them computerized cars. Or, for example, in my hotel room there's this computer that we call a microwave oven, which all of you have in your kitchen. And the point here is that the same thing is starting to happen in robotics, not just in their shrinking size and form and proliferation, but even in how we're stopping calling them robots now.
So, for example, if you've bought a new Ford, a new Toyota, or a Volvo, they come equipped with technologies like "parking assist" or "crash avoidance," which are very nice ways of saying: we stupid humans are not good at parallel parking and we don't always look in our blind spots, so the robotic systems in the car are taking that over for us.
Now, of course, again, with computers, why are they important in history? Are they important because I don't have to memorize long division tables anymore? It's all of the changes that they've had on commerce, on relationships; billions of dollars can be made in the stock market in a nanosecond, and billions of dollars can be lost in a nanosecond. Changing social relationships; I can become friends via social networking with someone in China that I've never met before. Of course, I might be concerned that my niece is becoming friends with someone that she's never met before. And of course, all the legal challenges that come out of that.
I, for example, do some consulting with the FBI, and I posed a question to them, sort of a mean question: what would J. Edgar Hoover think of the crime of identity theft? But we also have, of course, new domains of conflict that come out of that; cyber warfare, which we've heard about at this Ideas Festival, fighting in a place that literally didn't exist a generation ago.
The last comparison that people make is to the atomic bomb. And this is from the cross between the scientists and the ethicists: the atomic bomb and the research in nuclear physics around 1944, 1945. And they think about this in two ways. One is that if you are a young graduate student today in the sciences, particularly in engineering or computer science, you are drawn towards the field of robotics and artificial intelligence because it's the cutting edge of the field. A lot like, if you were a young graduate student back in the 1940s and you wanted to work on what was the cutting edge, you were drawn towards nuclear physics; that's where the action was at, that's where the funding was at, that's where the impact was at.
But also those same people that were drawn into nuclear physics later worried about the fact that they had created a genie that they couldn't put back in the bottle. And people in the robotics field worry about those comparisons as well.
The underlying point that I'm making here is that in discussions of technology, we usually focus on the nuts and bolts of how the technology works. But, what really matters is all of the ripple effects that it has on our world, all of the tough issues and questions that come out of it.
And so for several years I went around the world, essentially interviewing anyone and everyone involved in the cross between robotics and war. What was it like to be a scientist working on these systems? What was it like to be a science fiction author and see your dreams become reality -- and a lot of them were actually working with the Pentagon. What is it like for those in the military; everything from the young 19-year-old flying a plane from Nevada that's actually over Iraq, to their squadron commander, all the way up to the generals that command them? What about the politicians, what do they think about this? The media, how is it reporting the story, not just the American media but also media in places like Pakistan or Lebanon? And then finally the right and wrong of this; so interviews with everyone from military lawyers to human rights activists.
And what I would like to do is sort of walk you through what I think are some of the more interesting stories or ripple effects that we've identified coming out of robotics. Now, one of the major ripple effects coming out of this is simply changes in the field itself, in the market. We're seeing this in three ways. One is that the size, shape, and form of robotics is blossoming out. The first generation of these systems looked a lot like the manned systems that they were replacing, even down to the planes literally having the cockpits painted over.
Now, as you're seeing from the pictures here, they're coming in all sorts of sizes, shapes, and forms, from ones mimicking nature to wings the length of football fields. I was in an Air Force lab where I saw a system that could literally fit on top of a pencil.
The second thing that's changing, though, is that their intelligence and their autonomy is growing. And this is really a game changer in war. And I'll give you a sort of historic parallel. If you were back in World War II and you were comparing the famous B-17 bomber to the B-24 bomber, you would say, well, the B-24 bomber is newer, it flies faster, it flies further, it carries more bombs; that's the difference between the two.
And so you could say the same thing about the MQ-1 Predator and its new replacement, the MQ-9 Reaper. The Reaper flies faster, it flies further, it carries more bombs. But there is a fundamental difference that we've never seen in the history of weapons and war before. The Reaper is smarter, it's more autonomous. It's not out there making its own decisions like in the Terminator movies, but it can do things like take off and land on its own and fly mission waypoints on its own. It has smart sensors that allow it to detect a disruption in the dirt from a mile overhead, tell the humans that that disruption in the dirt is what you guys call a footprint, and then backtrack where that footprint came from and say, here's the hideout that that insurgent was in. That's a big change when you pull back and think about weapons.
But the final change that's happening is that the user base and functionality of these systems is just exploding. Originally, to fly them, to operate them, you had to be an expert. And I think about the comparisons here to computers. A lot of you will remember the first time you used computers -- when I first used computers you had to learn this strange language. For me it was called BASIC; other people learned other languages: FORTRAN, DOS, et cetera. It was like a foreign language. My three-year-old son can operate his own iPad and find his favorite videos of garbage trucks whenever he wants, and he doesn't even know how to type yet.
And it's not that the system is simpler; it's actually that it's simpler to use but more complex underneath. And that change is playing out in the user field for robotics. So at first, to fly these systems, you had to be a trained pilot. Now, there is an iPhone app to fly them.
And so what that is leading to is a cross between imagination, innovation and profit seeking. And so we're seeing robotics going into all sorts of different areas outside war; everything from police and law enforcement usages, to nurses' aides, to environmental survey, cargo, firefighting, disaster response, you name it. And a big game changer that's looming inside the U.S. is 2015, when Congress has legislated that the FAA basically has to open up the airspace to these systems.
And I was talking with the vice president of a leading robotics company. They make small drones that they sell to the U.S. military. And things are good for them. He said, we have a great client, one client: the U.S. military. But looking forward, they think when this legal change happens they'll have at least 21,000 new clients. And what they were doing is counting all the state and local law enforcement departments, police departments out there that either can't afford their own police helicopters, or have them and they're really expensive, and their small drone systems, they think, are going to be competitive against them. Basically, the flight-hour comparison for a police helicopter is as much as $1,000 an hour compared to under $100 an hour for the drone. And they think, just from a budget standpoint, a lot of police departments are going to look at it that way.
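The budget arithmetic behind that pitch is easy to run. Here is a minimal sketch using the two per-hour figures cited in the talk; the 400 flight hours a year is a purely hypothetical department workload chosen for illustration.

```python
HELICOPTER_RATE = 1_000  # dollars per flight hour, upper figure cited
DRONE_RATE = 100         # dollars per flight hour, figure cited

def annual_cost(flight_hours, rate):
    """Annual operating cost for a given yearly flight load."""
    return flight_hours * rate

# Hypothetical department flying 400 hours a year.
hours = 400
heli = annual_cost(hours, HELICOPTER_RATE)   # 400000
drone = annual_cost(hours, DRONE_RATE)       # 40000
print(heli - drone)  # 360000
```

Since both costs scale linearly with hours flown, the roughly tenfold gap holds at any flight load, which is why the budget argument is so simple to make.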
But this, of course, leads to more ripple effects, when you open up new markets and customers, with people using them for everything from war to firefighting to police to -- in Australia they use them to help hunt great white sharks. It leads to another series of questions. While this is a robotics revolution, as you've seen from these pictures here, it's not just an American revolution. There's a rule in technology and in war: there's no such thing as a permanent first-mover advantage.
So, quick show of hands here. How many people in this room have ever used a Commodore computer? How many of you still use your Commodore computer? The same phenomenon happens in war. The British were the first ones to use the tank. They actually got the idea from an H.G. Wells short story called "The Land Ironclads." Winston Churchill reads it, thinks that would be great for us to use in that trench warfare in World War I, but we can't call them land ironclads because that will give the secret out; that will make it too easy to figure out what we're building. So we'll call them water tank carriers instead. And of course, if you know your history, the British may have been the first to use the tank, but the Germans, by the time World War II rolls around, are the ones who figured out how to use the tank better.
And so the challenge for us is that the same thing is happening in robotics. The U.S. is definitely a leading player, particularly in military robotics, and we should be; we spend about $0.48 of every dollar spent on the military out there in the world. However, there are over 50 other countries also now building, buying and using military robotics. And they range from countries like Great Britain, France, Israel, Germany, Pakistan, Iran, Russia, China, you name it.
China, for example, has gone in the last 5 years from having no unmanned aerial systems to displaying 25 different models at their last military trade show. And their spending on it, at least as estimated by the Frost & Sullivan consulting company, is in the next 10 years going to go up not 10 percent, not 100 percent, but see 1,000 percent growth.
Now, one of the things that comes out of that is another question, another ripple effect: the strange cross between national security and intellectual property rights. That is, if this is a technology that's comparable to the rise of the automobile, the rise of computers, the atomic bomb and nuclear physics, we're likely to see attempts to steal secrets in it for both economic and political reasons. And we're seeing the same thing play out here.
One of my favorite examples was talking to a sales guy for a ground robotics maker. He was in charge of the East Asian market. And he went out to a trade show in the Pacific, and he was looking at the Malaysian army's display, and they had one of his robots there, and he got really angry. And he calls back the home office and he says, "I'm the guy in charge of the East Asian market; who's been selling in my turf?" And they said, "We've never sold to them." It was a Chinese knockoff of their ground robot.
And we're seeing this play out at a meta level in cyber security issues. I was visiting a major defense contractor 2 years ago, and they knew of five different advanced persistent threat campaigns; that is, five different organized, sort of Ocean's Eleven-like crews of people trying to steal secrets from them. I went back a couple of months ago and they're aware of 32 campaigns against them. And that's just the groups they're aware of. So the growth is huge.
The final example of this concern played out a couple of years ago, when it turned out that insurgents in Iraq were listening in on, watching the video feed from our Predator systems. They weren't able to control the systems; it was more like a criminal, a bank robber, tapping into the police radio network, which is why the police encrypt it. The unfortunate problem is that we're often our own worst enemies in this technology. The reason why the insurgents were able to pull it off was three things coming together.
The first is that no one had planned to be going from zero to 7,000 of these systems, and so the communications network was cobbled together and security often fell by the wayside. The second was that no one factored in how fast technology could change. They were aware of the vulnerability in the comms network as far back as 1999. The difference was, in '99 to tap into the network you needed a huge device and it was highly complex. By 2009, when this played out, you needed a piece of $29 software that college kids had invented to illegally download movies from the Internet. And the insurgents were getting it off of a Russian website.
And, finally, arrogance. As one of the U.S. Air Force officers in charge of the program put it, we knew about the potential security breach but, "We didn't think anyone in the Middle East would be able to figure it out." Again, this was just listening in on the systems. It was not controlling the systems.
But as we move forward, we move into an era of what I call battles of persuasion, where the goal is not merely to destroy the system but potentially to co-opt the system, take over the system, persuade it to do something that its original owner wouldn't have wanted it to do.
Again, something not possible before in war. You could never persuade an arrow to change course in mid-flight, and you couldn't persuade a human pilot in a jet fighter, but you can persuade a system to recode all American jets as Syrian jets, et cetera.
But as we've heard about at the Ideas Festival, maybe there are just other things to be worrying about. Maybe it's just the question of where we're headed as a nation in our economy and education. So, what do you think the next 20 years will bode for America's manufacturing economy? Do you think it's going to get much better, do you think it's going to stay the same, or do you think it's going to get worse? What about the next 20 years in America's science and mathematics education? Do you think we're going to get a lot better, do you think we're going to stay the same, or do you think we're going to get worse? However you answered those questions is probably the best indicator of where we're going to head in this field on the global competitiveness stage.
Another way of putting it is, what does it mean for U.S. security that the number of students who graduate with a degree in information technology or electronic engineering is slightly less than we graduated in 1986, but that we've had a 500 percent increase in recreation, leisure and fitness studies?
Now, another issue that comes out of this is open sourcing. We've focused in on states. But this technology is one that multiple actors can tap into, just like open source software. It's not just the big boys that control it anymore.
So, for example, I advise on the Call of Duty video game series. And for that we conceived of an armed quadcopter system. And it will be in the game coming out in a couple of months; you know, be sure to buy it for your kids. But what was interesting is that in the marketing for the game, we decided, wouldn't it be neat if we could build a working version of that armed quadcopter? And we created a viral video that showed a working version of the armed quadcopter.
A Pentagon office saw the commercial and said, hold it, that video game company has figured out how to build a better system than every other American tactical robotic system today. Yeah, and I'm part of the defense industrial complex now in a strange way. And what I'm getting at here is that this technology is not like an aircraft carrier -- and this is one way that it's not like an atomic bomb. It's not something that requires a huge defense industrial complex to utilize.
So, for example, a group like Hezbollah could not build its own aircraft carrier. And even if you parked an aircraft carrier off Lebanon and said, "Here's the keys," they wouldn't be able to utilize it effectively. Hezbollah, though, has already operated robotic systems. In its war with Israel a couple of years ago, even though it's not a state military, it still flew drones against Israel, and Israel flew drones against it.
And this goes into all sorts of other actors. For example, my favorite vote for Innovator of the Year last year wasn't someone from Apple or Google; it was a group of jewelry thieves in Taiwan who, using tiny robotic helicopters and pinhole cameras, carried out a $4 million jewelry heist.
Sounds again like a Hollywood movie, but it's already been done. And so the impact of this is that we're seeing the true empowerment of individuals and small groups compared to the power of the state. So in World War II, Hitler's air force, the Luftwaffe, could not strike the United States; it could not reach across the Atlantic. Three years ago, a 77-year-old blind man built his own drone that flew itself across the Atlantic, so he had greater reach than the entire Luftwaffe.
And of course this impacts into areas like potential terrorism. You don't have to be suicidal to play in this game. There was an individual in Boston a couple of months ago, actually in October, who wanted to fly a plane into the Pentagon. His plan wasn't that he would hijack a plane; his plan was that he would get a drone, load it up with explosives, and then fly it into the Pentagon. He was able to get the drone. He made the fortunate mistake of asking an FBI informant, where do I get C4 explosives from? So we're in a world where a would-be terrorist found it easier to get the drone than the centuries-old technology of explosives.
Now, there are other ripple effects that come out of this, big ones to worry about. One is our politics. And we were talking about this in another session, and it's come up again and again at the Aspen Ideas Festival: the idea of what robotics is doing to the relationship between the American public and its wars. In my mind, robotics takes certain trends that are happening in our body politic maybe to their final logical ending point.
Think about it this way when it comes to the linkages. We don't have a draft anymore; the last college graduating class to worry about being drafted just celebrated its 30th reunion. We don't declare war anymore; the last time the U.S. Congress actually declared war was 1942, against the minor Axis powers like Bulgaria and Hungary that we forgot to include in the original day-after-Pearl-Harbor declaration of war. It's been 70 years since we've actually declared war. We don't buy war bonds or pay war taxes anymore.
During World War II the American public bought $185 billion worth of war bonds. If you raised $2 million worth of war bonds, you got to name your own ship. In the last 10 years of war, we've bought zero war bonds, and the richest four percent, which likely includes Aspen attendees, instead got tax breaks.
The point that I'm making here is that we have those trends already in play, and now we have a technology that takes out the last factor: the political consequences of sending people into harm's way. The barriers to war were already lowering; the technology takes them to the ground. And it's not just political theory I'm talking about here. We're seeing this, for example, in the not-so-covert war, as I call it, in Pakistan, where we have carried out 334 air strikes using unmanned systems into Pakistan over the last several years. That's roughly eight times the number of targets hit over the course of a war 10 years ago.
And we didn't have a vote on it, yes or no. It's also hitting overt operations. So, for example, in Libya, we had the reaction there to a potential massacre by a dictator of people in a city, and so we deployed forces to stop that massacre. And then as the operation shifted to aiding a group of insurgents and regime change, we said, no, we're pulling back from this; we're not going to be in an active role, the rest of NATO is going to take this over. And so we got to the 60-day mark, where under the War Powers Resolution Congress is supposed to vote aye or nay, or we pull out under the law; this is one of those post-Vietnam laws. The executive branch said to Congress, you don't have to do this, you don't have to vote aye or nay, because "U.S. operations do not involve the presence of U.S. ground troops, U.S. casualties or a serious threat thereof." The operations after that mark, however, did involve something we used to think of as being in war: blowing things up, a lot of it. After that mark, we carried out 145 air strikes using unmanned systems and did most of the targeting for all of NATO's manned air strikes, all the way to the very last air strike that got Gaddafi.
Now, the point that I'm making here is that war, as we used to look at it, involved two things together: the kinetic side, the blowing-things-up side, but also the side of sending people into harm's way. That's what it meant to go to war. The technology is allowing us to disentangle those two parts, and our laws really haven't caught up to it.
And notice I'm not saying these are good or bad operations. I'm just saying that something we would have previously considered war is something we -- everyone from the policy field to the media to those of us in this room -- aren't treating as war anymore.
Now, the impact of this also hits areas like our own psychology. How is this hitting the war of ideas? That is, what is the message we think we are sending when we utilize these technologies versus the message that is being received on the other end of them? So I wanted to find this out. For example, I remember doing an interview with a leading State Department official, and he said, "The unmanning of war plays to our strength. The thing that scares people is our technology." But of course, there's a flipside to that.
I was interviewing a leading newspaper editor in Lebanon, and actually during the interview a drone buzzed overhead. And he said -- this is his quote -- "Just another sign of the cold-hearted cruel Israelis and Americans who are also cowards because they send out machines to fight us. They don't want to fight us like real men but they're afraid to fight." I don't think that's true. But that's his perception. And that perception matters. It matters in lots of different ways.
For example, the would-be Times Square bomber got into the game of terrorism, at least by his own admission, out of anger over the drone strikes that we were carrying out to try and stop terrorism. There are big questions to figure out here. Essentially, our challenge is: how does a technology, a tactic, fit into our overall strategy, our long-term endgame? How do we get out of a game of Whac-A-Mole when it comes to terrorist leaders?
But, of course, this is impacting other aspects, such as the experience of going to war, not for the nation but for the individual. For 5,000 years the idea of going to war has meant going to a place of such danger that you might never see your family again. That's a shared experience of everyone from the ancient Greeks going to fight at Troy to my grandfather going to fight the Japanese in the Pacific in World War II. That idea of going to war: going to a place of such danger you might never see your family again.
This is how the first Predator squadron commander described what it was like to fight insurgents in Iraq and Afghanistan while never leaving Nevada: "You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants. And then you get in the car and you drive home. And within 20 minutes you're sitting at the dinner table talking to your kids about their homework." That is a fundamentally different experience of going to war.
And we need to be clear here. It's not an easy experience, it's not a video game experience, it's not an antiseptic experience, which are some of the terms I've heard at panels at Aspen. It's a tough experience, everything from the commander's job to how it impacts the psychology of these operators. In fact, the levels of stress and burnout in some cases are as high as for ground units physically in Afghanistan.
But another impact of this is on the demographics of war. Who can do what in war? Perhaps my favorite story in the book is about a young 19-year-old man who was a high-school dropout. His dad was really disappointed in him, and he wanted to make his dad proud of him again, so he volunteered to join the U.S. Army. The recruiting officer asked him, what would you like to specialize in, son? He said, I'd like to be a helicopter mechanic; that would be really cool. And the officer looked at his high school transcript and said, I'm sorry, son, you failed your English literature class, which means you're not qualified under Army rules to be a helicopter mechanic. Would you like to be an unmanned aerial systems operator instead?
He turned out to be incredible at it. He turned out to be a natural, so to speak. Though it wasn't that he was a true natural; he had spent his entire life training for just this kind of exercise. And he turned out to be so good that after his first deployment they promoted him; they made him a specialist. Do we have anyone here who's recently served in the Army, or in the military, who can say what a specialist gets paid?
SPEAKER: Not too much.
MR. SINGER: Not too much. A little over $20,000 a year. This is not a big promotion. But besides that promotion, they then made him an instructor in the pilot training academy. It's a really cool story from one perspective: because of this technology, this young man found himself serving his nation well and making his dad proud of him again.
I told this story at the Air Force Association National Convention in a speech just like this. They did not like the story. You have a teenage high-school-dropout enlisted man in the Army who's not just a pilot instructor right now but has helped take out more enemy targets than every single F-22 fighter jet pilot put together. They looked at him with the same mix of dismay and disdain and fear that the knights had when they looked at the peasants with guns: hold it, what are you going to do?
And, of course, this demographic change, who can do what in war, is not just happening outside us; it's also changing ourselves. As we've seen at a couple of the panels here at Aspen, we're starting to put technology into our bodies, on our bodies, and it's changing us.
So one of the more powerful stories to come out of this technology: more than 400 American soldiers who unfortunately lost arms or legs to IEDs, to roadside bombs, in places like Iraq or Afghanistan have had them replaced with robotic prosthetics that are so good that they've gone back to serve in their combat units with a prosthetic arm or leg. The test to prove that they can do it is equivalent to the Ironman Triathlon.
But, of course, we don't stop at replacement. Much of the funding now goes to what we call enhancement: getting bigger, better, faster, stronger. And that raises a whole new series of legal and political and ethical concerns to figure out. It's basically a lot of the issues we hear talked about at the Olympics, now hitting warfare and regular life.
So much of what you've been hearing from me is that there are always two sides to a technologic revolution. Moore's Law is playing out, but Murphy's Law isn't disappearing. We're getting science fiction-like capabilities, but science fiction-like riddles to answer. And sometimes people think that these are simple.
I remember talking with a senior executive at a ground robotics maker, and he said, "No, no, these are just oops moments." When things don't work out with your robot it's just an oops moment. You do a product recall and you fix it. So what are examples of oops moments so far in robotics and war?
Sometimes they're kind of funny, like when they were field-testing a machine-gun-armed ground robot and it went, quote, "squirrelly." It started spinning in a circle and pointed its weapon system at the VIP stand of people there to watch it. They were very glad that there were no bullets in the .50 caliber machine gun at the time. Other times these oops moments can be tragic, like a couple of years ago in South Africa, when an automated 35 millimeter anti-aircraft cannon had what the report essentially called a software glitch.
And we've all suffered from software glitches. For example, I fly a lot; a couple of years ago one software glitch in a computer at the FAA facility in Atlanta shut down half the national airspace. In this case, the software glitch caused the weapon system during a training exercise not to fire into the sky like it was supposed to but to level and fire in a circle. It killed nine soldiers before it ran out of ammunition.
And the point I'm making with these oops moments is not just that they happen. Imagine you're the young officer, the young lawyer, asked to investigate one, to figure out who to hold accountable. What system of laws would you turn to?
In war, our prevailing legal codes date from the same year that the 45 RPM vinyl record was invented. And it's a lot to ask of a legal code from that era to catch up to 21st century technology like this. I remember asking these questions at Human Rights Watch. And one of their lead officials actually said that, in fact, the Prime Directive from Star Trek would be more useful to us today. That may be true. But we can't call Captain Kirk into a real court of law, and this technology is real.
So I'm going to end here. Bottom line, it sounds like I've been talking about the future. But notice how every story I told you, every picture you saw, is from the present reality of technology and war and politics today. And so it poses a question to us, a lot like what people dealt with with horseless carriages or atomic bombs: are we going to let the fact that this looks like science fiction, feels like science fiction, keep us in denial that this is our reality?
MR. SINGER: So we've got some time left. Anyone who wants to ask a question, just wait for the mic to come around, stand up, and please introduce yourself. The first hand up was right here.
SPEAKER: I'm just curious what percentage of the research and development that's going into robotics is military and what percentage is private sector?
MR. SINGER: It's a great question. And I've never seen an exact breakdown of it by percentage. One of the reasons it's really hard to figure out is that the vast majority of basic and applied research in this space comes from the military, particularly from agencies like DARPA and the Office of Naval Research, but it's often for mixed areas.
And this really points to an interesting ethical question that a lot of the scientists are wrestling with. To give you an illustration: one scientist I was talking with said, I've got nothing to do with war. I don't work on war robots; this is not my area of concern. In fact, what I'm building is a baseball-playing robot. I said, okay, who's your funder, who's funding that research? He was in a university. He said, the Office of Naval Research. Okay, why are they interested in that? It's not because they want the Naval Academy baseball team to beat Army with robotic players. It's because a robot that can do things like react to the trajectory of an incoming ball or projectile and move to that place before it lands may have some application in war. A robot that can operate under a very complex system of rules and still carry out a set of functions may be useful in baseball and may be useful in war.
And so this is one of the real issues within the roboticists' field: where do they come down on this question of DOD funding? You have sort of three groups, and to be frank, I have a huge amount of respect for two of them and not the third. Some are the folks who say, "I'm very proud of what I do. My work is saving soldiers' lives out there in Afghanistan or the like." You saw the picture, like a postcard that one of the manufacturers displays: this is what we're so proud of.
And then you have another group, the refuseniks -- basically, people who say, I don't want to be part of this and I won't accept any DOD money, even at the start of it. The thing is, both of these groups are small; if I had to put percentages on it, 5 to 10 percent each.
And there's this vast group in the middle that kind of says, "Yeah, I'm working on it, but I'm really not dealing in war." And I think part of it, frankly, is something we've seen again at this festival: while we want to stovepipe issues into fields, they quickly cross over. One of the roboticists put it to me this way: "I can't really think about ethics because I don't own a philosopher's hat, so I'm not even going to put that hat on." The problem is that by then you've already made an ethical decision, even though you haven't thought about it that way.
Right there, in the white shirt and the glasses. Yeah.
SPEAKER: I agree with you very much. But it seems like you're on a 20-year timeframe. Could you maybe tell the audience a little bit about Drexler's work? Because when you jump from robotics to the nanoscale, that's going to bring up a lot more issues.
MR. SINGER: So Drexler -- there are two things we can bring into this. One is some of the game-changing work in nanotechnology, and the other is a bigger discussion about the singularity, which his work fed into. So I'm working on a project that we call NeXTech. And the idea of it is to try and figure out: okay, all of these technologies you saw pictures of seemed really fascinating, really game-changing, but what's next? You know, the Predator drone first started flying back in the early '90s.
Actually, the Predator was originally called Amber. It was fortunate for the company that they came up with a much better sounding name for war; you can imagine the military not being excited to buy Amber. Okay, so what's the equivalent to that? What are some of those things that are just starting out right now? To find these fields, we did interviews with everyone from military lab directors to visionaries at places like Google or Apple to venture capitalists putting money into these fields.
And some of the game changers they pointed to in that space were directed energy, artificial intelligence, all the work in genetics and bio, and nanotechnology. And the big shift in nano is where it moves. Nano means operating at a molecular level, and the big change is as the field shifts from nanotech to nanorobotics.
A lot of us already have nanotech in our lives. For example, if you have ever worn a pair of Dockers stain-shielded, wrinkle-free pants, or if you've ever played tennis with a Dunlop tennis ball, you've used nanotechnology. Nanorobotics is not just tools, like a better hammer, but robotic -- that is, machinery -- operating at that level.
And we have seen the creation of nanomotors, but not all the other parts of an actual nanorobot. But people think that's looming. The experts in the field get into fierce arguments about when they think that's actually going to happen -- everything from a couple of years from now to 5 to 10 years from now -- but they're very, very optimistic about it happening.
And as the questioner was putting it, that's a true game changer. Because when you can build at a nano level, the first thing you do is build something else that can build at a nano level, and then suddenly you're able to enter what science fiction folks have called the diamond age, where diamonds are worthless, because when you're operating at a molecular level it doesn't matter whether you're building a diamond or a piece of chocolate; you just need the carbon. So it's a big, big change. Of course, there are a lot of concerns about runaway nano, how you contain it, et cetera.
Historically, this work also feeds into an idea a lot of you may have heard of: the singularity. The idea that you hit a moment where you can't predict what happens next. Where you have technology so advanced that we're like the monks being shown a printing press. Hey monk, what is the world going to look like now that this thing exists? And the monk would say, well, you know, they might use it to better illustrate Bibles, I don't know.
No one back then could have predicted things like mass literacy, democracy, the Reformation, the Thirty Years' War, Sports Illustrated swimsuit issues -- you name it, no one could predict all the things that would come out of a game-changing technology like the printing press. And so they feel that we're at the cusp of this, where people like you and I won't be able to predict what comes next.
My point is I think we're already at that moment without nanotechnology. We don't need sentient robots, we don't need machines at the microscopic level, to see that it's already changing our wars, our politics, what's possible and not possible in commerce, the need for new laws, et cetera.
The gent sitting here in the front. Yes, stand up and go ahead, yeah.
MR. McDONALD: Thank you. Zack McDonald (phonetic), former Army infantry sergeant. I am, quite frankly, very excited about drones; anything that can save the lives of American soldiers and enhance their ability to kill the enemy I think is a positive thing. One concern I do have, though, is with the emergence of unmanned vehicles, either ground-based or aerial, that are launched by individual infantrymen rather than UAV specialists, such as the upcoming Switchblade drone.
I think that these hold a lot of promise, but at the same time I remember an Army private launching a Raven, which is a sort of unmanned aerial vehicle, straight into a Hesco barrier. So I'm concerned: are individual soldiers who are not UAV specialists receiving the proper training to utilize these technologies?
MR. SINGER: It's a great question, and it hits one of the big important issues within the military today: when you get a game-changing new technology, it's not just what you're going to buy, it's your doctrine. How are you going to use it, how are you going to train around it, how are you going to organize around it? You know, we can pose this question -- I pose it sometimes to military audiences -- are manned systems and unmanned systems the same? If they're the same, we don't have to change anything.
If they're different, we need to change something, but what do we change? How do we change our training, how do we change our equipment? Do we allow them to be autonomous or not? Do we have centralized control, or do we distribute them across the force? Do we have them operate in a hub-and-spokes model, a mothership model, or do we have them operate as swarms or wolf packs, to use a nature parallel? These are all questions to figure out.
And why that's important: if we get this right, we figure out the 21st century version of the blitzkrieg. If we get this wrong, we're setting the U.S. military on the path of the 21st century version of the Maginot Line. On the specific question you ask: there are literally not just hundreds but I would argue thousands of U.S. servicemen and women who are alive today because of this technology. That's why it's not going away. Those powerful proof stories are inarguable.
But as you put it, there are all these questions about the best way to adapt to it. And the Switchblade is a great illustration. The Switchblade is a new concept crossed with an old concept. It's a drone, but a tiny drone, about the size of a rolled-up magazine, and a small unit can carry it in a backpack rather than needing a big thing like a Predator. They can fire it and it takes off and flies. And then if they see a bad guy -- hold it, there's a bad guy on that building over there; we've detected him without having to rely on people hundreds of miles away to let us know about it -- they can convert the Switchblade into a robotic kamikaze. It will fly into that bad guy and blow him up, because it carries explosives.
This is really powerful. What it does is give a small unit of enlisted men basically the equivalent of a small cruise missile. That's why they love the concept. That's exactly why senior officers hate the concept. Because they go, hold it: are they going to be responsible enough with this system? Do they have the right training with this system?
And that's a particular concern you hear from the Air Force: being a pilot is not just about controlling the joystick, even though that's what we've measured ourselves on, where our identity is from. What may matter just as much is all of that training an officer gets in Air War College or the like. And so there is a real back and forth on that, but the problem for those making the argument is that the aviator field has traditionally measured itself by its technical skill set, not by these dilemmas.
You know, if you ever read the bio of a senior Air Force officer, it's how many flight hours are in their bio; for a senior Navy aviator, it's how many carrier deck landings. Whereas what may matter more is that human thing that robots can't match: the intellect and the heart to figure out what's right or wrong.
Just right here in the front.
SPEAKER: Can you talk a little bit about domestic deployment of drones and what they might be -- what the planned uses are? Are they armed?
MR. SINGER: So domestic uses of unmanned aerial systems -- it's a topic that's taken off in the news recently. And, you know, I've been part of that discussion, but the problem is there has been a lot of myth and hysteria layered on top of really, really deep concerns. The current rules today are that if you want to operate a robotic system in the air, you can, as long as it doesn't go above 400 feet and you don't operate it near an airport or the like.
If you want to operate above 400 feet, you have to get a special license, and you also can't use it for commercial reasons. And despite that, we've seen a blossoming of different roles, sometimes for commercial reasons. We've seen everything from do-it-yourself builders of these smaller-scale systems -- there are more than 10,000 people using these homemade quadcopter kits -- to people using them for commercial reasons.
For example, the LA Times had a story about a photographer for real estate agents. You know, you look online, you're about to buy a house in LA or in Aspen, and you get the pictures of the rooms. But wouldn't it be neat if you could get an overhead picture of the entire landscape, or a picture of what it's like to walk through a room? So why don't we fly a little quadcopter through the room? There was a story about that usage.
To news organizations using them -- after there was a big flood in North Dakota: let's fly our own camcopter over it and get pictures of that. To people using them for disaster response after Hurricane Katrina. Lots of different usages of this.
There is actually a marine system, the one you saw there, that started out being used by tuna fishermen to find where the tuna were. And the Marines said, well, that's useful, why don't we use it? So it's their main system in Afghanistan today.
The change, though, is above 400 feet, where you need a special license right now. The special licenses have only gone to a limited set: a couple of police departments, like Miami-Dade and a little (inaudible) police department in Mississippi, and environmental survey operations in Alaska. We're talking under a hundred special licenses.
In 2015, though -- and let me be clear, Congress told the FAA this without any debate about this part of it; it was part of the FAA budget authorization, and the people in Congress concerned with privacy had no clue about it -- they basically said, FAA, figure out how to make it happen by 2015 to open up the overall airspace. And that's the game changer, because then suddenly people can operate at lots of different levels for lots of different usages.
That, of course, opens up lots of different privacy concerns, be it in police hands -- I remember speaking with a federal district court judge who said this will be a Supreme Court case, because it applies to areas like probable cause; flying overhead, you're going to pick up a lot of things, and you're going to have to figure out when and where the search warrant applies -- or in terms of paparazzi usages of these systems.
On the armed side, we haven't seen actual armed systems yet, although -- and you'd be surprised which state this happened in -- there was a sheriff's department that bought a version of a quadcopter that could be armed with shotguns, but they just put nonlethal weapon systems on it. So there is that potential. I don't think that's where we're going, because police helicopters don't normally operate armed, and I wouldn't see the same thing happening with drones. But let's move to ground robotics.
We do have questions of, do I have a right to bear robotic arms? Take the owner of a bar in Atlanta. There was a group of homeless people who gathered, just like here, on the other side of the street from the bar. He didn't like it; he thought it was bad for business. He didn't want to personally confront them, so he built what he called the Bum Bot, which basically was a ground robot armed with a water cannon that he sent out to chase the homeless away.
Was that his right to do or not? We could argue back and forth, thinking about home defense, et cetera -- big, big questions coming out of this. The point I'd make is, the computer back in 1980 didn't seem like a technology that would cause all sorts of legal issues, and yet a huge number of Supreme Court cases have connected to computers. We'll see the same thing with robotics.
Just right there. Wait for the mic.
SPEAKER: Given the number of countries that have some sort of robotics, and given the velocity of development of the technology -- I realize I am asking you to speculate -- how long do you think it will be before we'll truly have a war between enemies that is essentially robot against robot?
MR. SINGER: It's a really great question, and people often -- it's funny, it happened at literally one of the panels here at Aspen -- make this sci-fi reference. There is a Star Trek episode where that exactly happens: they have advanced to a point where just the machines fight, and then the people from the two cultures line up, the computer spits out how many people would die, and they accept their deaths instead of all the destruction of war.
And Kirk, of course, thinks this is the most horrible thing and destroys the system, leaving them to go back to the good old-fashioned ways of war. I don't think that will happen anytime soon, because at the end of the day this is a technology that still connects to us, still connects to humans. And we're seeing not complete replacement of military units but more a mixing of manned and unmanned.
Where I think it's headed is more towards what we call in the field "warfighter's associates." The parallel would be a policeman with a police dog: the team is better than each of them on their own, humans doing what they are good at, robots doing what they are good at. Or a sports parallel would be a quarterback and the wide receivers, with the human being the quarterback.
The quarterback calls the play, the wide receivers carry out the play, but we give the wide receivers autonomy so that they don't just blindly follow the play if things change; if they think the quarterback is about to be sacked, they can turn around and come back or the like. And I think we'll do the same with our robotic systems, where we say, here is the play for you to follow, but we're going to allow them to adjust and come back.
Another reason why I don't see complete replacement is, of course, there are vulnerabilities in these technologies. Some of the vulnerabilities are high-tech ones, like the cyber issues I mentioned. But also a lot of war today is messy -- insurgency, warlords, urban combat -- and so you're seeing low-end responses to robotics.
One example: in Iraq the insurgents studied our ground robots and figured out the exact angle at which they could crawl out of a ditch and the angle at which they couldn't. And so they dug ditches -- tiger traps, an old technology that was actually used in Vietnam as well, by the Vietcong -- to capture our ground robots, and a couple of times they then used them against us.
Or they were really, really mean in one way. They figured out the exact height that the arm of one ground robot could reach, and then they would put the IED half an inch above that. And so you'd see the American robot just trying to reach for it. Of course, they could have put it a foot above, but they put it half an inch above -- a kind of psychological jab back at the Americans.
I mean, we're talking about technology, but there's a psychology to it: they're saying, we're learning and we'll hit back; we don't need robots to figure this out. So you're going to see this back and forth and back and forth. What I would say is, you got the tank, but it didn't mean you got rid of all these other technologies out there in war. Robotics, though, is definitely a game changer.
Okay, we've got time -- wait, I'm getting the end sign. So I'll follow my orders here like a good robot. Thank you all for coming out.
* * * * *