Talking about your feelings can be difficult. Now imagine being a robot.

For social robots to make their way into everyday life, the machines will need to communicate their internal state effectively and react less like machines and more like animals, or even humans.

With some inspiration from a bird’s ruffled feathers and a blowfish’s protruding spikes, Cornell University’s Human-Robot Collaboration and Companionship Lab, led by mechanical and aerospace engineering Professor Guy Hoffman, developed a prototype skin that allows a robot to express its emotions, so to speak.

The soft material can change its texture, producing goosebumps or spikes that map to the robot's internal state.

The research is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented in April at the International Conference on Soft Robotics in Livorno, Italy.

Lead author and doctoral student Yuhan Hu spoke with Tech Briefs about what kinds of new interactions are possible when a human knows how a robot “feels.”

Tech Briefs: What inspired you to create this robotic skin?

Yuhan Hu: In this work, we drew inspiration from the animal kingdom, where some species change their skin texture in response to an external or internal stimulus: a cat raising the fur on its neck, a blowfish protruding its spikes, a bird ruffling its feathers. We found that this widespread and easily readable behavior had not been used in designing robots’ nonverbal behavior, and we think it can offer new kinds of interactions between humans and robots.

Tech Briefs: Why design a robot that gives off nonverbal cues?

Yuhan Hu: Nonverbal cues play a central role in human communication. For social robotics, it’s also important for robots to understand and use human communication cues. Research in human/robot interaction shows that a robot’s ability to use nonverbal behaviors can make its expressions more effective and improve the user’s experience by providing more familiar, less machine-like behavior.

The robot prototype expresses its "anger" with both its eyes and its skin, which turns spiky as fluidic actuators beneath the surface are inflated, based on the robot's "mood." (Image Credit: Lindsay France/University Photography)

Tech Briefs: What kinds of nonverbal cues are being emulated?

Yuhan Hu: Just as a blowfish grows spikes to send a “don’t touch me” signal, we found that the shape of the textures is useful for signaling whether or not the robot is in a positive state. We are also looking at the cues sent by the dynamics of the texture changes, including their frequency and amplitude, which map naturally onto human experience. For example, our heart rate and breathing rise to a higher frequency when we are in an excited state.
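To make that mapping concrete, here is a minimal sketch, not code from the Cornell lab, of how an affective state described by valence and arousal might be translated into texture parameters: valence selecting spikes versus goosebumps, and arousal setting the frequency and amplitude of the modulation. The names and numeric ranges are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class TextureCommand:
    """Parameters for one texture expression."""
    shape: str           # which fluidic chamber to drive: "spikes" or "goosebumps"
    frequency_hz: float  # how fast the texture units pulse
    amplitude: float     # peak inflation, 0.0 (flat skin) to 1.0 (fully raised)

def affect_to_texture(valence: float, arousal: float) -> TextureCommand:
    """Map an affective state to texture parameters.

    valence: -1.0 (negative) to 1.0 (positive)
    arousal:  0.0 (calm)     to 1.0 (excited)

    Hypothetical mapping: negative valence raises spikes ("don't touch me"),
    positive valence raises rounded goosebumps, and arousal scales the pulse
    rate the way heart rate and breathing speed up with excitement.
    """
    shape = "spikes" if valence < 0 else "goosebumps"
    frequency_hz = 0.2 + 1.8 * arousal              # ~0.2 Hz calm, ~2 Hz excited
    amplitude = min(1.0, 0.3 + 0.7 * abs(valence))  # stronger feeling, taller texture
    return TextureCommand(shape, frequency_hz, amplitude)

# An "angry" robot: negative valence, high arousal -> fast, tall spikes.
print(affect_to_texture(valence=-0.8, arousal=0.9))
```

Any real mapping would come out of the lab’s user studies rather than hand-tuned constants like these.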

Tech Briefs: How does the robotic skin you developed allow a robot to express “emotions” through changes in its outer surface?

Yuhan Hu: The textured skin consists of multiple texture units arranged in a grid. Each unit is made of an elastomer with a hollow core, so it can change its surface shape in response to its internal pressure, much like a balloon. The units can be customized by designing the shape of the cores and by embedding expressive haptic tips on top. All units with the same shape are connected through an internal fluidic chamber, so under pressurized air the initially flat skin grows expressive shapes with both haptic and visual effects.
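Since all units of one shape share a fluidic chamber, expressing a state reduces to driving each chamber’s pressure over time. The sketch below is again an illustration under stated assumptions: `set_chamber_pressure` is a stand-in for whatever pump or valve interface the hardware actually exposes, and the pressure values are invented. It pulses one chamber as a raised sine wave at a chosen frequency and amplitude.

```python
import math
import time

def set_chamber_pressure(chamber: str, pressure_kpa: float) -> None:
    """Stub for the hardware output; a real system would command a pump or valve here."""
    print(f"{chamber}: {pressure_kpa:5.1f} kPa")

def pulse_chamber(chamber: str, frequency_hz: float, amplitude_kpa: float,
                  duration_s: float, update_hz: float = 20.0) -> None:
    """Inflate and deflate one fluidic chamber as a raised sine wave, so the
    skin cycles between flat (0 kPa) and fully raised (amplitude_kpa)."""
    dt = 1.0 / update_hz
    t = 0.0
    while t < duration_s:
        # Raised sine: starts at 0, peaks halfway through each period.
        p = 0.5 * amplitude_kpa * (1.0 - math.cos(2.0 * math.pi * frequency_hz * t))
        set_chamber_pressure(chamber, p)
        time.sleep(dt)
        t += dt

# Drive the "spikes" chamber fast and hard for an agitated expression.
pulse_chamber("spikes", frequency_hz=2.0, amplitude_kpa=30.0, duration_s=2.0)
```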

Tech Briefs: How do the shapes correspond to specific emotions?

Yuhan Hu: We conducted user experiments to map the “emotions” to texture changes, with the frequency, amplitude, and shape of the textures as control variables. We are now testing whether users consistently perceive those texture expressions as particular emotions.

Tech Briefs: Can you take us through an application you envision with this kind of capability?

Yuhan Hu: We hope that this tactile system can first make robots more expressive and improve communication by allowing users to see and touch a robot’s “feelings.” Furthermore, these kinds of expressions would be useful when the visual and auditory channels are not available. For example, they could help people with visual impairments communicate with a household robot, or help humans communicate with robots in emergencies when visibility is blocked by environmental conditions.

Tech Briefs: What role do you see social robots playing in our everyday lives, especially robots with this kind of skin?

Yuhan Hu: Social robots serve as social interfaces between machines and humans, using natural communication cues to make it easy for us to communicate effectively with them. In general, we hope such social cues in the form of skin changes can strengthen a user’s understanding of a robot’s intentions. For example, a spiking skin may signal that the robot is in a “no touch” state, and the sensation of sharp tips under the palm will have more of an emotional effect on us than a disapproving face on a screen.

What do you think? What kinds of communication can be achieved with robotic bumps and spikes? Will this kind of skin lead to new kinds of interactions between humans and robots? Share your comments below.