Can we make robots that experience emotions?

Sadly, robots just aren’t feeling it

By Meg Murphy

We can create robots that look like they have feelings, says Rosalind Picard, a professor of media arts and sciences at MIT and founder and director of the Affective Computing research group at the MIT Media Lab. The thing is—we’re talking appearances, not reality.

Take Sony’s robotic dog Aibo, first released in 1999. It looks happy when you come home and sad when you scold it. As Picard says, a person might think, “Gee, my dog has feelings.” The truth, however, is that Aibo is responding to a computer program, not to you. It matches a set of inputs with an automated output. “Aibo has a robotic tail, for instance, and the tail is programmed. The owner greets it and the program runs the wagging-tail movement. Or the owner says ‘bad dog!’ and the program stops the tail,” she says. “The dog does not have any conscious experience or feelings that go with being happy or sad.”
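Aibo’s actual software is proprietary, so as a rough illustration of what “matching a set of inputs with an automated output” means, here is a hypothetical Python sketch; the event names and behaviors are invented for the example.

```python
# Hypothetical sketch: a fixed input-to-output table, not Sony's actual Aibo code.
# The apparent "emotion" is just a lookup that triggers a canned behavior.

RESPONSES = {
    "owner_greets": "wag_tail",   # looks "happy"
    "owner_scolds": "stop_tail",  # looks "sad"
}

def react(input_event: str) -> str:
    """Return the pre-programmed behavior for a recognized input event."""
    return RESPONSES.get(input_event, "idle")

print(react("owner_greets"))  # -> wag_tail
print(react("owner_scolds"))  # -> stop_tail
```

Nothing in that table experiences anything; the same mechanism could just as easily map the same inputs to any other motor command.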

In 2008, researchers in Picard’s group built Shybot, a personal mobile robot designed both to embody shyness behaviors and to elicit reflection on them. It was a favorite with the public. Using face detection and proximity sensing, the robot reacted to human presence and familiarity, categorizing people as friends or strangers. “When a person came near it, it scurried backward unless it really got to know you,” says Picard. “Depending on your approach, it could look like a nervous, anxious robot.”
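Shybot’s real control code is not reproduced here; the following is a minimal sketch, assuming a simple distance threshold and a familiarity score (both invented for illustration), of how “friend or stranger” sensing could drive the retreat behavior Picard describes.

```python
# Minimal Shybot-style rule (illustrative only; not the actual Shybot code).
# Proximity to a stranger triggers retreat; a familiar face does not.

FAMILIARITY_THRESHOLD = 0.7  # assumed cutoff between "stranger" and "friend"
PERSONAL_SPACE_M = 1.0       # assumed distance at which the robot reacts

def choose_action(distance_m: float, familiarity: float) -> str:
    """Map sensed distance and familiarity to a 'shy' behavior."""
    if distance_m > PERSONAL_SPACE_M:
        return "idle"
    if familiarity >= FAMILIARITY_THRESHOLD:
        return "approach"        # it "really got to know you"
    return "scurry_backward"     # looks nervous around strangers

print(choose_action(0.5, 0.2))  # -> scurry_backward
print(choose_action(0.5, 0.9))  # -> approach
```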

“It’s easy to make robots look like they are feeling all kinds of things,” adds Picard. “If you want it to look more afraid, for instance, you give it a little more jitter. If it has eyes, you make the eyes open up more.” Mimicking emotional signifiers, she notes, is not exactly new. “In the movies, we see robots with different personalities. They feel real but are the product of acting. In a similar vein, we can create the illusion of consciousness.”
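As a rough illustration of how such display cues can be dialed up or down, here is a hypothetical sketch in which a single “fear” value scales jitter amplitude and eye opening; the parameter names and ranges are invented for the example.

```python
# Hypothetical sketch: one "fear" knob scales two display parameters.
# Nothing here models an emotion; it only changes how the robot looks.

def fear_display(fear: float) -> dict:
    """Map a fear level in [0, 1] to display parameters for a robot face."""
    fear = max(0.0, min(1.0, fear))         # clamp to the valid range
    return {
        "jitter_amplitude_mm": 2.0 * fear,  # more jitter -> looks more afraid
        "eye_openness": 0.5 + 0.5 * fear,   # eyes open wider with more "fear"
    }

print(fear_display(0.1))  # slight jitter, eyes barely widened
print(fear_display(0.9))  # strong jitter, eyes wide open
```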

The technology is growing more sophisticated by the day, she says. Robots can now seem to share our mindsets. For example, when people feel happy, they are more likely to search for creative explanations, to think outside the box. Robots are often programmed to search broadly and make remote associations, creating a similar impression.

“The robot gives you responses that seem creative and playful and similar to those we have when happy,” says Picard. “It appears as if the robot is thinking.” But don’t think of this as autonomy, she warns. The machine is using memory processes that are simply “a little more than smiling and wagging its tail.”
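As a toy example of what “making remote associations” might look like in code, the sketch below picks a loosely related concept from an invented association table; real systems draw on much larger knowledge bases, and nothing about the mechanism implies understanding.

```python
import random

# Toy sketch of "remote association" (table invented for illustration).
# Picking a less obvious neighbor of the topic makes replies seem playful.

ASSOCIATIONS = {
    "rain": ["umbrellas", "jazz", "window seats", "petrichor"],
    "coffee": ["mornings", "typewriters", "late-night code", "cardamom"],
}

def playful_reply(topic: str) -> str:
    """Build a response from a remote association with the topic."""
    related = ASSOCIATIONS.get(topic, ["that"])
    return f"{topic.capitalize()} always makes me think of {random.choice(related)}."

print(playful_reply("rain"))
print(playful_reply("coffee"))
```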

Thanks to Abhik Maji, from India, for the question.

Posted: October 02, 2017
