Human Loneliness & Robot Bonding


A recent report published in BBC Capital’s “Augmented Reality” column sets out some real-life technological scenarios that might affect us in the near future. One of them is working alongside a robot colleague – and how, as a person, you might come to bond with it.

Boomer, The Military Robot

If you had visited Taji, Iraq in 2013 – well, you might have seen something peculiar. The site lies an hour north of Baghdad and is home to a US military base, with dusty floors and formidable concrete walls. It is in this brutal environment that, following a lethal explosion, a group of soldiers tenderly remembered their fallen comrade. He just so happened to be a robot.

To all who knew him, this brave hero was affectionately nicknamed Boomer. He had saved many lives during his service, by going ahead of the team to search for lurking bombs that had been laid by the enemy. At his funeral, Boomer was decorated with two medals, the prestigious Purple Heart and Bronze Star, and his metallic remains were laid to rest with a 21-gun salute.

Boomer was a MARCbot, a military robot that looks a bit like a toy truck with a long neck, on which a camera is mounted. They’re relatively affordable for robots – they’re each about $19,000, or £14,000 – and not particularly difficult to replace. And yet, this team of soldiers had bonded with theirs. When he died, they mourned him like they would a beloved pet.

Retirement Party for Mail Robots

Fast-forward a few years and this story isn’t as unusual as you might think. In January 2017, workers at CBC, the Canadian Broadcasting Corporation, threw a retirement party for five mail robots. Rasputin, Basher, Move It or Lose It, Maze Mobile and Mom had been pacing the company’s hallways for 25 years – delivering employee mail, making cute noises and regularly bumping into people.

There was a cake. There were balloons. There was a nostalgic farewell video. There was even a leaving card with comments like “Thanks for making every day memorable” and “Beep! Beep! Beep!” The robots will likely spend their final years relaxing at one of the many museums that have requested them.

Though they’re often portrayed as calculating job-stealers, it seems that there’s another side to the rise of the robots. From adorably clumsy office androids to precocious factory robots, we can’t help bonding with the machinery we work with. We feel sorry for our non-human colleagues when things go wrong, project personalities onto them, give them names and even debate over their gender. One medical robot-in-training, Sophia, has been granted citizenship of Saudi Arabia.

Likeable Robots

Not all collaborative robots, or “cobots”, were designed to be likeable. Many are just rectangular boxes that lack faces, the ability to speak and any artificial intelligence. So why do we care about them? And what does it mean for the future of work?

“When I first got this particular job, one of my colleagues had actually helped to design one of the robots I worked with,” says Olivia Osborne, a scientist specialising in nanotechnology at the University of California, Los Angeles (UCLA). “And he was the one who said, ‘Oh, this one’s got a mind of its own. It’s called Zelda’.”

Zelda, The Research Photographer

Zelda’s job was to take photos of zebrafish embryos. These images could then be analysed by Osborne, who was studying the effects of toxic nanoparticles on their ability to develop normally. “You’re in a room literally just with machinery, and you start to get attached to it. You kind of feel sorry for it, because it’s not getting anything, apart from electricity, right?”

Robot Behaviour

At the heart of all these unlikely friendships is the natural human tendency to personify all kinds of entities, including animals, plants, gods, the weather, and inanimate objects. At one end of the scale, this can lead to comparisons between peppers and politicians. At the other, it can lead to videos of polar bears petting dogs going viral.

In the right conditions, we’ll even ascribe personalities to rocks. In one experiment that won the Ig Nobel Prize (a humorous award given to silly or strange achievements in science), students were shown pictures of rocks and asked which personality traits applied to them. To the researchers’ surprise, they had no trouble with this, and each rock ended up with a distinct personality: some were apparently like “a big New York type businessman, rich, smooth, maybe a little shady”, while others were “a hippie”.

But when it comes to robots, this behaviour reaches spectacular new heights. In many cases, we aren’t just humanising them, but empathising with them. Last year, the internet was alight with concern for a “suicidal” security robot that had “drowned” itself in a fountain at a shopping complex in Georgetown, Washington DC. Steve the Knightscope security robot, who looks like a cross between a Doctor Who Dalek and Star Wars’ R2-D2, was left in a critical condition after stumbling on some steps. Its human co-workers rushed to its aid, and dramatic footage of its rescue was captured by crowds of onlookers.

Bonding With Robots

In fact, our empathy for them has some striking parallels with our feelings for fellow humans. In 2013, a team of scientists at Germany’s University of Duisburg-Essen scanned the brains of volunteers while they watched people being affectionate or violent towards a human, a robot, and an inanimate object. One staged scenario involved putting the victim’s “head” into a plastic bag and strangling them, while others included hugs and massages.

Though the volunteers didn’t feel quite as bad for the robots as they did for people, the same brain areas were active whether they were watching a robot or a human being tortured. In another study, the same team found that we have a tangible physical reaction to watching robots being harmed.

How to Connect With Robots?

If we’re going to go a step beyond simple empathy and actually befriend our robot colleagues, it’s thought that a few things need to fall into place. First of all, we need a motive.

Throughout human history, we have given human names to cannons, swords and boats – and, more recently, to equipment such as cars, wind turbines and robots. “A lot of this sort of usage goes back to people’s way of trying to relate to huge machines that are very difficult to handle, very treacherous,” says Peter McClure, who studies naming at the University of Nottingham. “They sort of christen them or nickname them, in order to exercise some sort of control over them. A sort of prophylactic thing, you know?”

One example of this led to the coining of the word “gun”. Back in 12th-Century England, “Gunnild” was a popular name for a woman. A couple of hundred years later, this Old Norse name – derived from words meaning “battle” – was given to a mechanical crossbow that defended Windsor Castle, the Lady Gunnilda. As its usage evolved, it was shortened to “gun” and transferred to hand-held firearms, which were themselves extremely dangerous and unpredictable.

The Naming Prejudice

Indeed, McClure has noticed that machines tend to be given female names, possibly for sexist reasons. “I suspect that there’s some attempt to exercise male control over the female,” he says. In the modern world, this might explain the tradition of naming tunnel-boring machines – giant, 150-metre-long monstrosities with several rows of sharp teeth – after women. The tunnels for the £14.8 billion ($19.6 billion) Crossrail project were dug by Ada, Phyllis, Victoria, Elizabeth, Mary, Sophia, Jessica and Ellie.

Human Loneliness and Robot Bonding

At the extreme end, this tendency to humanise the machines we rely on may lead to real emotional connections, as it did with Boomer in Iraq. “People anthropomorphise – get inside the heads of – objects constantly, and considering an object like a robot as human if that robot has an integral part in your survival is not that surprising,” says Lasana Harris, a psychologist at University College London.

Just as with other humans, it seems that these connections are strengthened by shared trauma. Mourning lost military robots isn’t at all unusual; on one occasion, the manufacturers were reportedly sent a box of robot remains, along with a note saying “can you fix it?”

But another common motive is loneliness. Way back in our evolutionary past, seeking out other people to bond with was vital to our survival. This is thought to be the reason that social isolation or rejection, such as a break-up, often manifests itself as physical pain; our bodies will do everything in their power to encourage us to make friends and keep them.

When humans are unavailable, our social needs must be met elsewhere. This may be a volleyball on a desert island, or a robot in an empty lab. According to a report in Wired magazine, some people buy Roomba robotic vacuum cleaners for lonely relatives, to keep them company. One retired professor who lived alone came to regard hers as a companion.

The Tangible Similarities

Finally, there need to be some tangible similarities between the robot and a human, so that our imaginations have something to go on. This might be the headlights and cooling grille of a car, which look like a face, or the ungainly attempts of a robot trying to place a box on a desk, repeatedly failing, then falling over.

“If the object is unpredictable in its behaviour, such as a car that won’t start, or exceedingly animate, behaving in a way that suggests self-propelled motion or agency, then it is more likely to be anthropomorphised,” says Harris. “These effects can increase if there are very few similar objects behaving this way, and if the object’s behaviour is observed in lots of different situations.”

Again, an example might be the Roomba, which research shows is easily personified – despite the fact that it’s just a black and white disc that makes beeping noises. In a 2010 study, actors were filmed while they pretended to be vacuum cleaners with a variety of personalities, such as “bold” or “careless”.

Scientists then used these videos to program the cleaners with those traits – a calm robot, for example, might make less noise. When a group of Dutch members of the public were asked to guess each robot’s personality, they were surprisingly accurate.

The Dangerous Side

Once an object has been humanised, our relationship with it becomes remarkably similar to the ones we have with other humans. And this is where things start to get dangerous.

For a start, we’re susceptible to the same psychological biases. Research shows that, just like people, robots are more likeable when they make mistakes. For example, participants in one study preferred the robots they were working with on a task when the robots violated social norms by saying something odd, or malfunctioned by providing faulty instructions.

In one BBC documentary, there’s a revealing moment at the end where Li Yan, a migrant worker at an Alibaba packing centre in China, describes her feelings about the robots she works with. “I feel like the robots are like humans. They can have errors and emotions as well. They will need humans to pay attention to them and to monitor them.”

It wouldn’t be ideal if people formed stronger bonds with their robot colleagues when they messed up tasks or turned out to be rubbish at their jobs. Many hospitals have begun hiring robot nurses to deliver drugs to patients. Though they’re just boxes on wheels, they’re remarkably human – able to open doors, call elevators and ask for help when they get stuck. But what if they delivered the wrong drug to a patient?

It’s easy to envisage a scenario where even the wholesome bonds between soldiers and bomb disposal robots could become a problem. Military history is littered with stories of heroes who ran into gunfire or braved IEDs, paying the ultimate price to save their friends.

If soldiers view their robot colleagues as people, they may feel that the robots are due the same protection from harm. After all, the opposite process – dehumanisation – has been used for thousands of years to justify violence towards enemy troops. During the Rwandan genocide, for example, the persecuted group was routinely compared to animals.

Falling In Love With Robots

There’s already been talk of the possibility that humans could fall in love with robots, which would open up another set of sticky ethical problems. The EU is currently considering whether the most sophisticated robots, those with artificial intelligence, should be deemed “electronic persons” and granted certain human rights.

But mostly, bonding with our robot colleagues is surely a good thing. Osborne was actually given the option of a human lab assistant, but preferred to work with Zelda, who she was less likely to argue with. “I had days when I was like this is awesome, we’re a great team,” she says. The robot was human enough to bond with, but also had some decidedly superhuman qualities, such as correcting Osborne’s mistakes.

Their Cleverness Shines Through

“Sometimes I’d put in a wavelength [of light] that I wanted it to take a photo with – say I wanted the red wavelength – and it would be like ‘hmm, I don’t think you want that wavelength! I think you want 444 nanometres, or something’ and you’re like ‘I do want that, yes…’,” she says. “I could have gone through a whole ream of wavelengths and wondered why it wasn’t working. That’s something people need to realise – they’re very clever.”

As robots enter the workplace, people are beginning to realise that they can be valuable allies – with many of the benefits of a companion and co-worker, but less of the politics. Boomer’s funeral may have been the first for a robot, but it surely won’t be the last. RIP.

Source: BBC
