To make people more comfortable with, and more trusting of, robots that may one day assist with work or daily life, researchers are modeling involuntary aspects of human behavior, like blinking or reacting to unexpected situations, to help robots seem more natural.
Our reactions to bright light, our heart rate, how often we blink, and how excited we feel are all controlled subconsciously by the nervous system. These involuntary physiological reactions are influenced by the time of day, our environment, and the many biological molecules called neuroendocrine substances, such as neurotransmitters and hormones, that flow through our bodies.
A team of researchers in Spain believe that infusing robots with these involuntary behaviors will make social human-robot interactions feel more natural and will reduce the reluctance some people feel when engaging with machines.
Their model for an artificial neuroendocrine system that controls involuntary behavior was recently published in Advanced Intelligent Systems.
Building social robots
“A social robot is a machine with different degrees of intelligence that is intended to interact with people, assisting them in applications like healthcare, education, or companionship,” explained Marcos Maroto-Gómez, lead author of this new research and post-doctoral researcher at University Carlos III of Madrid.
This work, developed with colleagues Álvaro Castro-González, María Malfaz, and Miguel Ángel Salichs, uses the social robot Mini, which was designed to assist the elderly with cognitive stimulation exercises through social and multi-sensory experiences, such as games or music.
For this to succeed, users must willingly engage with the robot as they would a human. With this goal in mind, Maroto-Gómez and colleagues set out to create an artificial neuroendocrine model so Mini could regulate and express involuntary physiological functions. This improves realism, as the robot subtly reacts to the environment and the user in a recognizable and familiar way.
“We believe that the robot has to simulate natural behaviors so the human will trust it and use it frequently,” said Maroto-Gómez.
Physical robot, artificial hormones
The team began with five involuntary processes for Mini to express: heart rate, pupil size, breathing rate, blinking rate, and locomotor activity, all controlled by five artificial substances modelled on natural molecules found in our bodies, such as serotonin and dopamine.
Using data from human studies which show the natural ebb and flow of neurotransmitters and hormones throughout the day, the team first built a biological clock for the robot. This set the baseline levels and ranges for the five artificial substances in a 24-hour period.
With this in place, they then created a neuroendocrine model of how external stimuli measured by the robot's sensors would influence the levels of the artificial hormones and produce the involuntary behavioral responses.
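The paper works this out in much more detail; purely as an illustration of the general idea, and not the authors' implementation, the Python sketch below uses invented substance names, baseline curves, stimulus gains, and response weights:

```python
import math

# Hypothetical artificial substances with illustrative 24-hour baselines.
# Names, peak hours, and amplitudes are invented for this sketch.
SUBSTANCES = {
    "artificial_serotonin": {"base": 0.6, "amplitude": 0.2, "peak_hour": 14},
    "artificial_dopamine":  {"base": 0.5, "amplitude": 0.3, "peak_hour": 18},
    "artificial_melatonin": {"base": 0.4, "amplitude": 0.4, "peak_hour": 2},
}

def circadian_baseline(name, hour_of_day):
    """Baseline level of a substance at a given hour, following a smooth 24-hour cycle."""
    s = SUBSTANCES[name]
    phase = 2 * math.pi * (hour_of_day - s["peak_hour"]) / 24
    return s["base"] + s["amplitude"] * math.cos(phase)

# Illustrative sensor stimuli and how strongly each pushes a substance up or down.
STIMULUS_GAINS = {
    "bright_light": {"artificial_melatonin": -0.3, "artificial_serotonin": +0.1},
    "loud_noise":   {"artificial_dopamine": +0.2},
}

def substance_levels(hour_of_day, stimuli):
    """Combine circadian baselines with sensed stimuli (intensities in [0, 1]), clamped to [0, 1]."""
    levels = {name: circadian_baseline(name, hour_of_day) for name in SUBSTANCES}
    for stimulus, intensity in stimuli.items():
        for name, gain in STIMULUS_GAINS.get(stimulus, {}).items():
            levels[name] += gain * intensity
    return {name: min(1.0, max(0.0, level)) for name, level in levels.items()}

def involuntary_responses(levels):
    """Map substance levels to the five expressed processes (weights are invented)."""
    return {
        "heart_rate_bpm": 60 + 40 * levels["artificial_dopamine"],
        "pupil_size":     0.3 + 0.7 * levels["artificial_dopamine"],
        "breathing_rate": 12 + 8 * levels["artificial_dopamine"],
        "blinks_per_min": 10 + 10 * levels["artificial_serotonin"],
        "activity_level": levels["artificial_serotonin"] * (1 - levels["artificial_melatonin"]),
    }

# Mid-afternoon, with a bright light and a sudden loud noise sensed by the robot.
levels = substance_levels(hour_of_day=15, stimuli={"bright_light": 0.8, "loud_noise": 0.6})
print(involuntary_responses(levels))
```

Even in this toy version the indirection does real work: a single startle-like stimulus nudges one substance, and that one change ripples into heart rate, breathing, and pupil size at once.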
This may seem overly complicated. For example, if the goal is to have the robot's pupils respond to light, one could simply program pupil size directly as a function of light intensity. Why add the extra layer of artificial hormones and neurotransmitters?
According to Maroto-Gómez, the neuroendocrine model allows them to link many different stimuli and responses together, something that cannot be achieved by simply mapping stimuli onto high-level processes like pupil dilation.
In humans, hormones and neurotransmitters are simultaneously involved in many different stimulus and response pathways that may not be obvious. Without the mesh of the neuroendocrine model connecting all these pathways in the robot, these subtleties would be lost.
“For example, it doesn’t make sense to relate pupil size with anger because there is no relation at that level,” explained Maroto-Gómez. “That relationship exists at the neuroendocrine level.”
An artificial system of neuroendocrine substances captures this detail, and as the model improves, will allow for a greater range of autonomous reactions.
Robots with a biological clock
The team plans to expand the number of neuroendocrine substances, processes, and stimuli the model can handle, and to use learning algorithms like deep reinforcement learning, which would allow the model to learn and improve from real-time experience rather than relying on a fixed dataset of human hormone levels.
Maroto-Gómez is also working to improve the realism of the biological clock model. “I’m trying to simulate a circadian clock that regulates all the neuroendocrine substances and therefore the biological processes dynamically,” he said.
Again, rather than a fixed pattern of serotonin or dopamine levels based on human data, “we can adapt the robot behavior to the light conditions, we can dynamically change the robot mood or emotions, depending on the weather conditions, if it is a cloudy day, or a sunny day,” said Maroto-Gómez.
In theory, this type of dynamic circadian clock could allow the robot to express jet lag when it changes time zones and environmental conditions. “In fact, we can make the robot adapt to a jet lag situation, and that’s exciting,” said Maroto-Gómez.
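None of this dynamic behavior is part of the published model yet, but the idea can be sketched. In the toy Python below, the clock's internal phase is nudged toward the ambient light cycle under the purely illustrative assumption that bright light means roughly midday; after a sudden time-zone change, the offset shrinks over several simulated days, much like jet lag wearing off:

```python
class AdaptiveCircadianClock:
    """Toy clock whose internal phase gradually re-entrains to the ambient light cycle."""

    def __init__(self, phase_hours=0.0, entrainment_rate=0.05):
        self.phase_hours = phase_hours            # internal offset from local time, in hours
        self.entrainment_rate = entrainment_rate  # fraction of the mismatch corrected per update

    def internal_hour(self, local_hour):
        """The hour the robot's body clock 'thinks' it is."""
        return (local_hour + self.phase_hours) % 24

    def observe_light(self, local_hour, light_level):
        """Nudge the phase when bright light is sensed. This sketch assumes bright light
        implies roughly midday, so the internal hour is pulled toward 12."""
        if light_level > 0.5:
            mismatch = ((12 - self.internal_hour(local_hour)) + 12) % 24 - 12
            self.phase_hours += self.entrainment_rate * mismatch


# Example: after a "flight", the robot's internal clock is 8 hours ahead of local time.
clock = AdaptiveCircadianClock(phase_hours=8.0)
for day in range(5):
    for hour in range(24):
        clock.observe_light(hour, light_level=1.0 if 7 <= hour <= 17 else 0.0)
print(round(clock.phase_hours, 1))  # the offset shrinks toward 0 as the clock re-entrains
```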
Unlike robots that work on predefined parameters and complete only specific tasks, Maroto-Gómez aims to build fully autonomous robots that run indefinitely while constantly adapting their behavior to the environment and to stimuli from their human counterparts.
According to Maroto-Gómez, advances in sensors and actuators, the physical components that take in information and translate signals into movement, need to keep pace with the modelling and AI learning components.
“The sensors are essential and from my point of view those robots that can perceive more information from the environment work better.”
To learn from or react to the environment, the robot needs to first perceive it. Maroto-Gómez believes that with improved sensors and actuators, “the behavior it [the robot] will exhibit will be richer and the possibilities for designers are broader.”
Reference: Marcos Maroto-Gómez, et al., Modeling Neuroendocrine Autonomic Responses in Embodied Autonomous Robots, Advanced Intelligent Systems (2022), DOI: 10.1002/aisy.202200288
Feature image credit: Marcos Maroto-Gómez, et al.