How Robots Can Feel with Optical Waveguides
Michael Alba posted on December 16, 2016 |
Researchers develop prosthetic hand that uses optical waveguides to feel its surroundings internally. (Image courtesy of Huichan Zhao.)

Researchers from Cornell University have developed stretchable optical waveguides that allow robots to feel their surroundings by measuring external strain. The new technology offers an alternative to the rigid and bulky tactile sensors seen in other robots.

Ultimately, the researchers built a soft prosthetic hand to demonstrate their findings.

An optical waveguide is a structure that guides visible light, such as an optical fiber.

By constructing a waveguide out of a stretchable material, the researchers produced what’s called an elastomeric optical waveguide, which can easily deform. It is this capacity for deformation that allows elastomeric optical waveguides to be used as tactile sensors.

The research team used a four-step, soft lithography process to produce their stretchable optical waveguides. By intentionally designing them to be lossy, their waveguides took on a desirable property: the more the waveguide deforms, the more light is lost to the environment.

Using a photodiode, researchers could detect the amount of light travelling through the waveguide. Since more deformation results in less light, researchers could figure out the amount of deformation on the waveguide and thus sense external strain.
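The relationship described above can be sketched in code. The snippet below is a minimal illustration, not the team’s actual method: it assumes a simple linear calibration between optical loss (in decibels) and strain, and the constant and function names are hypothetical.

```python
import math

# Hypothetical calibration constant: optical loss (dB) per unit of strain.
# A real sensor would be calibrated against known deformations.
LOSS_DB_PER_STRAIN = 2.0

def estimate_strain(baseline_power_uw: float, measured_power_uw: float) -> float:
    """Infer strain from the drop in light reaching the photodiode.

    The waveguide is intentionally lossy: the more it deforms, the more
    light escapes, so the measured power falls below the baseline power
    recorded in the relaxed state.
    """
    loss_db = 10.0 * math.log10(baseline_power_uw / measured_power_uw)
    return loss_db / LOSS_DB_PER_STRAIN

# Example: the photodiode reads 100 uW with the waveguide relaxed,
# and 50 uW when it is bent (a ~3 dB loss).
strain = estimate_strain(100.0, 50.0)
```

In practice the loss-versus-strain curve would be measured empirically for each waveguide, but the core idea stands: less light at the photodiode means more deformation.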

“If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” said researcher Robert Shepherd. “The amount of loss is dependent on how it’s bent.”

Illustration of the prosthetic hand built with optical waveguides. (Image courtesy of Science Robotics.)

The team combined several waveguides to create a prosthetic hand, which they subjected to several tactile experiments. In one, the hand was tasked with recreating the surface of seven differently shaped objects, with faithful results. In another, the hand felt three different tomatoes and successfully determined which one was the ripest.

"Most robots today have sensors on the outside of the body that detect things from the surface,” said researcher Huichan Zhao. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”

The new technology has applications in prosthetics, but could also be implemented in bio-inspired robots for the purposes of space exploration. However, further research is needed to improve the sensory capabilities of the waveguides, as well as to improve touch localization by incorporating machine learning techniques.

This research was supported by a grant from the Air Force Office of Scientific Research and made use of the Cornell NanoScale Science and Technology Facility and the Cornell Center for Materials Research, both of which are supported by the National Science Foundation.

You can read the team’s full paper here. For more robotics news, read Travelling to China? Your Customs Agent Could be a Robot.
