Emotional attachment to robots could affect outcome on battlefield

September 19, 2013

By Doree Armstrong, UW News and Information

Too busy to vacuum your living room? Let Roomba the robot do it. Don't want to risk a soldier's life to disable an explosive? Let a robot do it.

It's becoming more common for robots to sub in for humans to do dirty or sometimes dangerous work. But researchers are finding that in some cases, people have started to treat robots like pets, friends, or even as an extension of themselves. That raises a question: if a soldier attributes human or animal-like characteristics to a field robot, can it affect how they use the robot? What if they "care" too much about the robot to send it into a dangerous situation?

That's what Julie Carpenter, who just received her UW doctorate in education, wanted to know. She interviewed Explosive Ordnance Disposal military personnel, highly trained soldiers who use robots to disarm explosives, about how they feel about the robots they work with every day. Part of her research involved determining if the relationship these soldiers have with field robots could affect their decision-making ability and, therefore, mission outcomes. In short, even though the robot isn't human, how would a soldier feel if their robot got damaged or blown up?

A United States Army explosive ordnance disposal robot pulls the wire of a suspected improvised explosive device in Iraq.

What Carpenter found is that troops' relationships with robots continue to evolve as the technology changes. Soldiers told her that attachment to their robots didn't affect their performance, yet acknowledged they felt a range of emotions such as frustration, anger and even sadness when their field robot was destroyed. That makes Carpenter wonder whether outcomes on the battlefield could potentially be compromised by human-robot attachment, or the feeling of self-extension into the robot described by some operators. She hopes the military looks at these issues when designing the next generation of field robots.

Carpenter, who is now turning her dissertation into a book on human-robot interactions, interviewed 23 explosive ordnance personnel, 22 men and one woman, from all over the United States and from every branch of the military.
These troops are trained to defuse chemical, biological, radiological and nuclear weapons, as well as roadside bombs. They provide security for high-ranking officials, including the president, and are a critical part of security at large international events. The soldiers rely on robots to detect, inspect and sometimes disarm explosives, and to do advance scouting and reconnaissance. The robots are thought of as important tools to lessen the risk to human lives.

Some soldiers told Carpenter they could tell who was operating the robot by how it moved. In fact, some robot operators reported they saw their robots as an extension of themselves and felt frustrated with technical limitations or mechanical issues because it reflected badly on them.

The pros to using robots are obvious: They minimize the risk to human life; they're impervious to chemical and biological weapons; they don't have emotions to get in the way of the task at hand; and they don't get tired like humans do. But robots sometimes have technical issues or break down, and they don't have humanlike mobility, so it's sometimes more effective for soldiers to work directly with explosive devices.

Researchers have previously documented just how attached people can get to inanimate objects, be it a car or a child's teddy bear. While the personnel in Carpenter's study all defined a robot as a mechanical tool, they also often anthropomorphized them, assigning robots human or animal-like attributes, including gender, and displayed a kind of empathy toward the machines.

"They were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet," Carpenter said.

Many of the soldiers she talked to named their robots, usually after a celebrity or current wife or girlfriend (never an ex). Some even painted the robot's name on the side. Even so, the soldiers told Carpenter the chance of the robot being destroyed did not affect their decision-making over whether to send their robot into harm's way.

Soldiers told Carpenter their first reaction to a robot being blown up was anger at losing an expensive piece of equipment, but some also described a feeling of loss. "They would say they were angry when a robot became disabled because it is an important tool, but then they would add 'poor little guy,' or they'd say they had a funeral for it," Carpenter said. "These robots are critical tools they maintain, rely on, and use daily. They are also tools that happen to move around and act as a stand-in for a team member, keeping Explosive Ordnance Disposal personnel at a safer distance from harm."

The robots these soldiers currently use don't look at all like a person or animal, but the military is moving toward more human and animal lookalike robots, which would be more agile, and better able to climb stairs and maneuver in narrow spaces and on challenging natural terrain. Carpenter wonders how that human or animal-like look will affect soldiers' ability to make rational decisions, especially if a soldier begins to treat the robot with affection akin to a pet or partner.

"You don't want someone to hesitate using one of these robots if they have feelings toward the robot that go beyond a tool," she said. "If you feel emotionally attached to something, it will affect your decision-making."