How Robots Can Detect Human Emotions

As robots increasingly work alongside humans, the ability of machines to understand the desires and intentions of their human companions will be vital. A new study from Warwick Business School highlights how robots might soon be able to recognise human emotions.

The researchers discovered that humans can readily detect emotions such as sadness and boredom from the way people move, even when facial expressions and voices are hidden from the observer. The team believe robots could learn similar capabilities.

“One of the main goals in the field of human-robot interaction is to create machines that can recognise human emotions and respond accordingly,” the researchers explain.  “Our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to recognise a range of emotions and social interactions using movements, poses, and facial expressions. The potential applications are huge.”

Detecting emotion

The researchers filmed pairs of children playing with a robot and a computer built into a table, then showed the footage to several hundred volunteers, who were asked to report whether they thought the children looked bored, sad or excited. They were also asked whether the children were competing or cooperating, and whether one child had taken on a dominant role in the pairing.

What made the study particularly interesting was that half of the volunteers saw the full footage, while the other half saw a version in which the two children were reduced to stick figures. The figures made exactly the same movements, but cues such as facial expressions were absent.

Despite this apparent obstacle, both groups assigned the same emotional labels to the children they were watching in the majority of instances.

Training the machine

These results were then used to train a machine learning algorithm to label the clips accurately and to identify the kinds of social interaction they contained, including the emotions each child displayed.
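The study does not spell out the exact pipeline, but the underlying idea can be sketched simply: extract movement and pose features from each clip and train a classifier to predict the labels the volunteers assigned. The sketch below is an illustration only; the feature names, data file and model choice are assumptions, not details from the paper.

```python
# Minimal sketch (not the study's actual pipeline): train a classifier that maps
# pose/movement features extracted from video clips to the emotion labels that
# human volunteers assigned. All column names and the CSV file are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Each row: summary statistics of one child's skeleton track in one clip,
# plus the majority label given by the volunteers ("bored", "sad", "excited").
clips = pd.read_csv("clip_features.csv")  # hypothetical file

feature_cols = [
    "mean_joint_speed",       # how fast the child moves on average
    "posture_openness",       # how open or closed the body posture is
    "head_tilt_variance",     # variability of head orientation
    "movement_energy",        # overall kinetic energy of the skeleton
    "proximity_to_partner",   # distance to the other child
]  # assumed features for illustration
X = clips[feature_cols]
y = clips["volunteer_label"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# How well does movement alone predict the labels humans gave?
print(classification_report(y_test, model.predict(X_test)))
```

Any classifier that handles tabular features would do here; the point is simply that the volunteers' judgements become the training labels, so the machine learns to mimic human readings of body movement.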

“Robot delivery services are already being trialed, but people tend to attack or vandalize them, often because they feel threatened,” the authors explain.  “The aim is to create a robot that can react to human emotions in difficult situations and get itself out of trouble without having to be monitored or told what to do.  That doesn’t happen at the moment, because robots tend to follow scripts.”

They believe that being able to detect emotions such as stress, along with their strength and severity, will enable machines to interact with us far more intelligently.

“Different levels of stress require different responses. Low level stress might just require the robot to back away from the human, while a high level of stress might be best addressed by having the robot apologise and leave the interaction,” they conclude.
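As a rough illustration of that kind of graded response, a robot's policy could map an estimated stress score to an action. The thresholds and action names below are invented for the example and are not taken from the study.

```python
# Rough illustration only: map an estimated stress score (0.0-1.0) to a robot
# response, along the lines the researchers describe. Thresholds and action
# names are hypothetical.
def respond_to_stress(stress_score: float) -> str:
    if stress_score < 0.3:
        return "continue_interaction"   # person seems calm
    if stress_score < 0.7:
        return "back_away"              # low-level stress: give the person space
    return "apologise_and_withdraw"     # high stress: apologise and leave

for score in (0.1, 0.5, 0.9):
    print(score, "->", respond_to_stress(score))
```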
