Social Robot Prototype Can Share Its Emotions

Robot uses a textured "skin" to express how it feels.

Do you ever wonder what your robot might be feeling?

Engineers and robotics researchers at Cornell University have developed a robot prototype that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.

The inspiration for a robot that gives nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn't be thought of in human terms, explains Guy Hoffman, an assistant professor of mechanical and aerospace engineering at Cornell. Hoffman has given a TEDx talk on "Robots with 'soul'" and believes that robots shouldn't simply be modeled after, or be copies of, humans.

“We have a lot of interesting relationships with other species,” Hoffman said. “Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

Hoffman and doctoral student Yuhan Hu’s robot design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.
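The paper itself does not publish control code, but the mapping described above, discrete emotional states driving one of two texture shapes via inflatable fluidic chambers, can be sketched roughly as follows. The state names, pressure values, and function names here are illustrative assumptions, not the team's actual implementation:

```python
# Hypothetical sketch: mapping a robot's emotional state to a
# texture-actuation command for fluidic skin chambers.
# Shapes ("goosebumps", "spikes") come from the article; the
# state names and pressure levels are invented for illustration.

EMOTION_TEXTURES = {
    "calm":    {"shape": "goosebumps", "pressure": 0.2},
    "excited": {"shape": "goosebumps", "pressure": 0.8},
    "angry":   {"shape": "spikes",     "pressure": 0.9},
}

def actuation_command(emotion):
    """Return which chamber group to inflate, and how strongly.

    Chambers of the same shape are fluidically connected, so a
    single pressure value drives every bump of that kind at once.
    Unknown states deflate the skin back to a neutral surface.
    """
    profile = EMOTION_TEXTURES.get(emotion)
    if profile is None:
        return {"shape": None, "pressure": 0.0}
    return profile

print(actuation_command("angry"))   # spikes, high pressure
print(actuation_command("neutral")) # flat skin, zero pressure
```

The single-pressure-per-shape design mirrors the article's detail that fluidic chambers connect bumps of the same kind, so the controller need only choose a shape and an inflation level rather than address texture units individually.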

The robot prototype expresses its "anger" with both its eyes and its skin, which turns spiky as fluidic actuators inflate beneath the surface, based on its "mood." (Image courtesy of Lindsay France/University Photography.)

The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”

Hoffman does not have a specific application in mind for a robot whose texture-changing skin maps to its emotional state; at this point, proving it can be done is a sizable first step. "It's really just giving us another way to think about how robots could be designed," he said.

Future challenges include scaling the technology to fit into a self-contained robot – whatever shape that robot takes – and making the technology more responsive to the robot’s immediate emotional changes.

“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can significantly enhance the expressive spectrum of robots for social interaction.”

Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author, and the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

For more on interactive and expressive robots, check out these stories:

Aido Wants To Be Your Next Household Robot

Robots Can Do Anything, But Jazz Is For Humans

Source: Cornell University Newsroom