Dr. Julie Carpenter is a leading expert on robot-human relationships. She made headlines in early October when she released her groundbreaking new study on the emotional ties between military personnel and military robots. The study, which revealed that soldiers often name and even fall in love with their robots, and hold funerals for robots that have been destroyed, brought the public face-to-face with a reality that had previously been considered only in the realm of science fiction. As autonomous systems proliferate in both the military and civilian spheres, Carpenter’s research will become extremely important; it will help us understand not only how we come to have feelings for robots, but also the ethical, social, and practical consequences of developing relationships with machines. Carpenter has inaugurated a complex and perhaps uncomfortable discussion about an issue that, bizarre as it may seem now, will eventually touch everyone who interacts with autonomous machines.
Julie Carpenter recently completed a doctorate in Education at the University of Washington. She agreed to an in-depth interview with the Center for the Study of the Drone to discuss Pygmalion, humanoids, movies and love.
Interview by Arthur Holland Michel.
Center for the Study of the Drone Have you ever personally felt emotions for a robot, or can you at least empathize with a soldier who feels something for his/her robot?
Julie Carpenter Several years ago, at a human-robot interaction academic conference, I volunteered to be a participant in a study taking place on the spot. The researchers were looking at, among other things, people’s comfort levels with a humanlike domestic robot approaching them from different directions (left, right, from behind, and so on). They were using a PeopleBot model, if I remember correctly. As instructed, I sat in a chair and the robot approached me holding a tray with a cup on it. When it first approached me from the right (I’m right-handed), my instinct was to thank the robot for bringing me the cup. Of course, as soon as I felt myself fight that urge, I felt silly. At the same time, I knew it was my natural and socialized instinct to respond politely to a thing bringing me something, a “thing” performing a humanlike task. Because the study scenario was repeated at the conference, I watched other professional and academically trained roboticists respond much as I had when the robot approached them. We all chuckled, and we all acknowledged how easy it is to slip into a social dynamic with a robot. I’ve also played video games and felt that the avatar I used regularly was in some ways an extension of myself, in a virtual environment but also in a real social space. So, yes, I can empathize with a soldier who interacts socially with a robot, or even has a type of affection for a particular robot, or projects a sense of self into the robot, much like you might with a gaming avatar.
Drone Do you believe that feeling attachment to a machine is normal human behavior, or is it aberrant?
Carpenter I think there are individual cases where a person feeling attachment toward a machine indicates something abnormal or aberrant because of that particular person’s mental health history. I’m not a psychologist, so I won’t attempt to define that type of relationship. However, I believe the Explosive Ordnance Disposal personnel I spoke with displayed very normal emotions and reactions toward the machines they work with every day in close proximity, which they rely on to keep them safe, and which they take care of (for example, by charging their batteries). The machine is embodied and moves, and it performs human- or animal-like tasks such as defusing unexploded ordnance or conducting reconnaissance.
Drone People have been falling in love with their inanimate creations throughout history, all the way back to Pygmalion. Is this just a continuation of that trend, or a more extreme version of it? Or is this something entirely new?
Carpenter I refer to a history of storytelling about human-artificial human relationships in my dissertation, too. Many, if not all, cultures have similar stories: from golems to Pinocchio to Frankenstein’s creature, to Astro Boy, R2-D2, Wall-E, and the Terminator. The list goes on and on. The interactions may vary from friendly to destructive, but the underlying thread is that these relationships exist in our mythologies and are believable within the course of the stories; we don’t really question the human-nonhuman aspect of it because we can understand how the interaction might occur, and we relate to it somehow. We may have questions about the outcomes of these fictional relationships and whether they are good or bad, but we understand how the relationships happen in the first place. There are complex cultural reasons why we return to variations of these story themes, but I think one important way these characters continue to influence us is by framing our expectations of how we might interact with an artificial lifelike being when we actually do meet one in person. However, our expectations are not static things. While the stories of human-robot relationships may set up an idea of how we might interact with a fictional being, when actually faced with a real-life scenario with a robot, it’s likely our expectations will change over time.
One theme I found in my work with Explosive Ordnance Disposal personnel was that their expectations about working with robots, formed before they actually worked with them, did in fact change over time. Their prior experience with robots before the military varied from no experience at all, to building robots from hobby kits, to working with industrial robots. However, being trained on and working with specific models of EOD robots made them very aware of the machines’ capabilities and limitations. While the soldiers acknowledged the restrictions of the EOD robots, that didn’t stop them from offering human- or animal-like treatment to the robots.
Drone David Levy proposes that amorous relationships between humans and machines will become more common, and more extreme. In your study, one of the soldiers is described as having taken a nap with his robot, as though it was his girlfriend. Do you think that this raises ethical concerns?
Carpenter One soldier related a story about a teammate napping against his robot on the way to a mission and jokingly characterized the relationship as having an amorous parallel. I didn’t see that as problematic in itself. Here’s that part of the interview:
Researcher: Did you ever name any of the robots?
Connor: [laughs] Every single one.
Researcher: Can you tell me their names? Tell me more about that.
Connor: It was more just a way to be funny and keep our morale up. Towards the end of our tour we were spending more time outside the wire sleeping in our trucks than we were inside. We’d sleep inside our trucks outside the wire for a good five to six days out of the week and it was three men in the truck, you know, one laid across the front seats; the other lays across the turret. And we can’t download sensitive items and leave them outside the truck. Everything has to be locked up, so our TALON was in the center aisle of our truck and our junior guy named it Danielle so he’d have a woman to cuddle with at night.
Researcher: OK, do you have any other examples like that?
Connor: Well, Danielle got blown up so obviously she needed to be replaced. I don’t know…We’d name them after movie stars that we see at theater, or music artists, somebody popular, and then we’d always go to vote to decide on. (Connor, 22, SGT, Army)
I should mention that not only is “Connor” a pseudonym to protect his privacy, but I also decided to change the robot names in the transcripts.
Drone Why did you change the robot names?
Carpenter The idea of changing a robot’s name in a published paper is humorous, but I really wanted to do whatever I needed to do to protect the privacy of the folks who generously shared their experiences with me and trusted me with their stories. Within the U.S. military, EOD is actually not a huge field in terms of numbers, although it is growing. Since all branches have part of their initial training at Eglin Air Force Base, and because of the relatively small number of EOD personnel in existence, they often know each other either personally from work, via word-of-mouth, or through social networking. Using a robot’s actual name might have easily revealed the identities of the soldiers that participated in the study, and what they shared with me. I can’t think of another study where the robots’ names had to be changed to protect operator privacy, but it probably won’t be the last time it happens.
Drone You focused your study on soldiers who operate terrestrial robots. Can you imagine soldiers having similar feelings for an unmanned aerial vehicle?
Carpenter I think it’s more likely that UAV operators will experience an extension of self into the robot than some of the other aspects of a human-robot relationship that operators of terrestrial robots described, although they may also go through some variations of similar social actions. This extension of self is not a thing to be minimized. The EOD robot operators I spoke with often described a sense of loss (along with other emotions) if their robot was disabled, and frustration or even a sense of personal failure if the robot was unsuccessful, even if it was through no fault of the operator.
For UAV operators, I think it’s possible we’ll find similar feelings as well as new ones. I sometimes hear comments proclaiming how “easy” it is for UAV operators to cause destruction from a great distance, but recent research is actually demonstrating that this is not the case at all. Because drones have cameras, the operators may see any destruction resulting from their operations more closely than, say, a traditional pilot would. For the sake of discussion, if an operator has any sense of self invested in that UAV, it’s also possible that viewing the results of drone operations will impact operators psychologically in many ways, both short- and long-term.
Drone Boston Dynamics makes robots that resemble quadrupedal animals. Biomimicry is an expanding area of robotics. Do you think that it is foolhardy to develop machines that so closely resemble creatures?
Carpenter I do not think it is foolhardy at all. From a development standpoint, I think it makes perfect sense to model a robot on something nature has made to work so perfectly in certain environments for certain tasks. There are many advantages to biped or quadruped robots, such as the ability to move with agility over all different sorts of terrain. Bipedal robots often move better within spaces made for humans, such as houses or submarines. Robots with human-like hands may be able to use tools already in existence rather than requiring us to re-invent a special set of tools, and can more efficiently perform tasks with things made by or for human hands, such as Improvised Explosive Devices or doorknobs. Often robots take over roles previously carried out by either humans or animals, such as reconnaissance, load carrying, or nuanced dexterous work like dealing with unexploded ordnance. The integration of tools with human- or animal-like design into human teams appears to be naturally triggering an instinct to apply some socialness to our human-robot interactions; a socialness that goes beyond how we might interact with other tools. I believe this needs to be investigated and the findings applied to the evolving design of hardware, training, and the exploration of long-term impacts on human operators.
Drone How do you propose to continue and/or expand this research, and are there other interesting aspects of this topic that might be fruitful to explore?
Carpenter I’m fascinated by how people work with different field robots in the dynamic contexts of defense, whether the robots are semi-autonomous, drones, exoskeletons, mecha, or human- or animal-like. The robots are rapidly changing so many human-centered parts of the process: team structure, individual roles and training, social system dynamics, and mission outcomes. All of these areas interest me, and they will need to be explored as long as we are using robots. My EOD research indicated there are findings worth investigating further, including variables such as operator age and how it affects human-robot socialness, as well as possible interaction models coming formally from the organization or social system as a whole that may influence how EOD personnel interact with robots in teamwork.
At this point in the world of academic and industry robotics research, emphasis is still largely placed on advancing the engineering side of robotics rather than the human-oriented parts of the interactions that aren’t ergonomic or purely physical design issues. During development, I prefer to see the human part of human-robot interaction (HRI) researched in tandem with the engineering parts, instead of treated as a separate module before or after the process. In academia, I like to see many ways of investigating the human end included in human-robot interaction curricula, even if only as a general introduction to the idea that there are many valuable ways of exploring the issues. The current paradigm is not an easy hurdle for researchers like myself who believe there is value in investigating the human psychological and sociological issues in human-robot interactions, and that these factors impact every aspect of design and use, as well as the more global issues of ethics and the long-term outcomes of the interactions. The field of human-robot interaction is still relatively new, and it is finding and creating its collective identity in academic departments, industry, scholarly journals, and popular media. Theoretically, HRI is interdisciplinary, but in practice I’d argue we’re not quite there.
Let me give you a current industry example. Recently, the U.S. Special Operations Command issued a Broad Agency Announcement (BAA) for proposals and research in support of the development of Tactical Assault Light Operator Suit (TALOS)—what the media refers to as the “Iron Man Suit.” The BAA description I have seen is an inclusive call to agencies, academia, and individuals—as long as the proposals are related to the potential technologies planned for TALOS, such as the advanced armor, command and control computers, power generators, enhanced mobility, and other parts of the comprehensive wearable system. That’s wonderful! Now, where is the BAA that asks for similar research proposals about the psychological aspects of the people actually wearing these proposed suits? In addition to any physical requirements that may be needed for wearing TALOS, what sort of person will be able to effectively use TALOS? The skills that make an effective Special Ops soldier may not necessarily be the skills that match with the proposed functions of the suit. Now, I’m not trying to pick on the U.S. Special Operations Command; I am not part of the TALOS decision-making team, nor do I have insider knowledge about the TALOS project. Perhaps they have a team of psychologists and social scientists and human factors experts in place that will work as part of the TALOS development team. I certainly hope so, because it makes sense to build an exoskeleton with an understanding of who can use it as you develop it iteratively. Understanding the human side means more efficiently developing the suit design in general: who will wear the suit (maybe not even who they envision as a typical TALOS user right now), how they can most effectively recruit people who can best work within TALOS, and develop the appropriate training for TALOS users.
In a way, I feel like perhaps part of my role is to be an advocate for the human part of the human-robot interactions. I feel robotics is and should be an interdisciplinary activity or endeavor in order to design and build the most effective, interesting, efficient robots for any use scenario. But then again, I’ve been saying this in one way or another since at least 2005.
Drone What policy recommendations might you make, based on this research?
Carpenter There are already individuals and organizations taking on the task of exploring the policy side, such as the South Korean Robot Ethics Charter and the EURON Roboethics Roadmap. My global recommendation is to continue to monitor human interactions and explore the human-robot dynamic in many ways and through different lenses and research approaches: via psychology, ethnography, sociology, design, and art, to name a few. Why limit how we research and develop such important technology, one that, either directly or indirectly, impacts us all?
Drone Beyond academic research, how should we respond to the prospect of human-robot love; that is, how should we respond as a society?
Carpenter Like David Levy, I envision that there will be benefits and drawbacks to human-robot relationships. Using a robot as a means to therapeutically aid people has so many exciting possibilities, as others have reported after working with robots and children with autism, or with the elderly.
If you are speaking about a romantic human-to-robot “love”—and I’m using quotation marks because of course it is, at this point, a non-reciprocating dynamic—I think if the love for a robot becomes a barrier to someone’s interactions with other people or otherwise inhibits their happiness or functioning, that situation will need to be handled case-by-case. We can already get a glimpse of how society at large might perceive human-robot romantic relationships when we read about or know people who have similar models of human-nonhuman attachment or affection for things such as RealDolls, otaku superfandom for manga or anime characters, or people creating their own strong social connections via virtual contexts like SecondLife or World of Warcraft. There are contemporary movies like Lars and the Real Girl or AI, which explore outcomes of human-artificial human relationships. How should we as a society respond to human-robot “love”? I’d say with optimism and an open mind, balanced with critical thinking. To me, critical thinking doesn’t mean shutting down the progression of technology or social options, it simply means a continuation of the discussion by examining and questioning and responding as we move forward.