Robots Develop Emotions
[Image: A sad robot]
Researchers have completed the first prototype robots capable of developing emotions as they interact with their human caregivers and of expressing a whole range of emotions. Led by Lola Cañamero at the University of Hertfordshire, in collaboration with a consortium of universities and robotics companies across Europe, these robots differ from others in how they form attachments, interact, and express emotion through bodily expression.
Developed as part of the interdisciplinary project FEELIX GROWING (Feel, Interact, eXpress: a Global approach to development with Interdisciplinary Grounding), funded by the European Commission and coordinated by Cañamero, the robots learn to interact with and respond to humans much as young children do, using the same kinds of expressive and behavioral cues that babies use to learn to interact socially and emotionally with others.
The robots were created by modeling the early attachment process that human and chimpanzee infants undergo with their caregivers as they develop a preference for a primary caregiver. They are programmed to learn to adapt to the actions and moods of their human caregivers, and to become particularly attached to an individual whose style of interaction is especially well suited to the robot's personality profile and learning needs. The more they interact, and the more appropriate the feedback and level of engagement from the human caregiver, the stronger the bond becomes and the more the robot learns.
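The attachment dynamic described above, where a bond strengthens through repeated, well-matched interaction, can be illustrated with a toy model. This is only a minimal sketch, not the project's actual implementation; the class, method names, and the single "fit" score are all hypothetical simplifications.

```python
class AttachmentModel:
    """Toy sketch of caregiver attachment: repeated well-matched
    interactions strengthen the bond with that caregiver."""

    def __init__(self, learning_rate=0.1):
        self.learning_rate = learning_rate
        self.bonds = {}  # caregiver name -> bond strength in [0, 1]

    def interact(self, caregiver, fit):
        """Register one interaction. `fit` in [0, 1] is a hypothetical
        score for how well the caregiver's style suits the robot's
        personality profile and learning needs."""
        bond = self.bonds.get(caregiver, 0.0)
        # Nudge the bond toward the quality of this interaction:
        # good fits strengthen it, poor fits weaken it over time.
        bond += self.learning_rate * (fit - bond)
        self.bonds[caregiver] = bond
        return bond

    def primary_caregiver(self):
        """The individual the robot has become most attached to."""
        return max(self.bonds, key=self.bonds.get) if self.bonds else None
```

Under this sketch, a caregiver who consistently interacts in a well-suited way ends up as the robot's primary attachment figure, while a poorly matched caregiver's bond plateaus at a lower level.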
The robots are capable of expressing anger, fear, sadness, happiness, excitement, and pride, and will show very visible distress if the caregiver fails to comfort them when they face a stressful situation they cannot cope with, or fails to interact with them when they need it.
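The trigger for visible distress described above can be sketched as a simple rule: distress appears when a stressor exceeds the robot's coping ability and no comfort is offered. Again, this is a hypothetical illustration; the threshold logic and labels are not the project's actual model.

```python
def choose_expression(stress, coping_ability, caregiver_comforting):
    """Toy rule (hypothetical): pick a bodily expression label from the
    robot's current stress level, its coping ability, and whether the
    caregiver is providing comfort. All inputs in [0, 1] except the
    boolean flag."""
    if stress > coping_ability:
        # The robot cannot cope on its own; comfort from the caregiver
        # defuses the situation, its absence produces visible distress.
        return "calm" if caregiver_comforting else "distress"
    # Manageable situations need no caregiver intervention.
    return "content"
```

For example, a high-stress situation with no comforting caregiver yields `"distress"`, while the same situation with a comforting caregiver yields `"calm"`.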
“This behavior is modeled on what a young child does,” said Cañamero. “This is also very similar to the way chimpanzees and other non-human primates develop affective bonds with their caregivers.”
This is the first time that early attachment models of human and non-human primates have been used to program robots that develop emotions in interaction with humans.
“We are working on non-verbal cues and the emotions are revealed through physical postures, gestures and movements of the body rather than facial or verbal expression,” Cañamero added.
The researchers led by Cañamero at the University of Hertfordshire are now extending the prototype and adapting it as part of the EU project ALIZ-E, which will develop robots that learn to act as carers and companions for diabetic children in hospital settings. Within this project, coordinated by Tony Belpaeme of the University of Plymouth, the Hertfordshire group will lead research on the robots' emotions and non-linguistic behavior. The future robot companions will combine non-linguistic and linguistic communication to interact with the children, becoming increasingly adapted to their individual profiles in order to support both the therapeutic aspects of their treatment and their social and emotional wellbeing.
The FEELIX GROWING project has been funded by the Sixth Framework Programme of the European Commission. The other partners in the project are: Centre National de la Recherche Scientifique (France), Université de Cergy Pontoise (France), Ecole Polytechnique Fédérale de Lausanne (Switzerland), University of Portsmouth (U.K.), Institute of Communication and Computer Systems (Greece), Entertainment Robotics (Denmark), and Aldebaran Robotics (France).