New Stretchable Sensor Could Bring Touch to Robots, AR/VR

Researchers have developed a stretchable sensor to mimic the human sense of touch.

Stretchable sensors that can detect deformations could give robots a better sense of “touch.” (Image courtesy of Cornell University.)

Robots and virtual reality/augmented reality (AR/VR) technologies may have come a long way, but they are still missing a vital human component: the sense of touch. In a recently published paper, Cornell University researchers have developed rudimentary stretchable sensors that just might change that.

“Right now, sensing is done mostly by vision,” said Rob Shepherd, associate professor of mechanical and aerospace engineering in the College of Engineering. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”

The team based its latest creation on earlier research at the university’s Organic Robotics Lab, which, along with developing innovative sensory materials such as hybrid foams, had already established light as an optimal sensing medium. In that earlier design, light was sent through an optical waveguide, and a photodiode detected changes in the beam’s intensity to determine when the material was deformed. The approach could detect that a deformation had occurred, but not what kind.
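To make that limitation concrete, here is a minimal sketch of intensity-only sensing, assuming a normalized photodiode reading and a made-up loss threshold (neither value comes from the Cornell work):

```python
# Illustrative sketch (not the Cornell code): an intensity-only waveguide
# sensor can tell THAT the lightguide deformed, but not HOW. The baseline
# and threshold values here are hypothetical.

BASELINE_INTENSITY = 1.0   # normalized photodiode reading, undeformed
LOSS_THRESHOLD = 0.05      # fractional intensity loss that counts as a touch

def detect_deformation(photodiode_reading: float) -> bool:
    """Flag a deformation when enough light is lost from the waveguide."""
    loss = (BASELINE_INTENSITY - photodiode_reading) / BASELINE_INTENSITY
    return loss > LOSS_THRESHOLD

# A press and a bend can produce the same intensity loss, so this scheme
# cannot distinguish them -- the gap the SLIMS work set out to close.
print(detect_deformation(0.90))  # True: light was lost, but we can't say why
```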

To counter that, co-lead author Hedan Bai drew on silica-based fiber-optic sensors, which detect subtle shifts in wavelength to measure humidity, temperature and strain. The challenge was overcoming the incompatibility between silica fibers and soft, stretchable electronics.

“We know that soft matters can be deformed in a very complicated, combinational way, and there are a lot of deformations happening at the same time,” Bai said. “We wanted a sensor that could decouple these.”

The team’s work evolved into a stretchable lightguide for multimodal sensing (SLIMS). Unlike distributed fiber-optic sensors, which require high-resolution detection equipment, the innovation works with small, lower-resolution optoelectronics. Along with being more cost-effective and easier to manufacture, these devices are also well suited to smaller systems.

Cornell researchers in the Organic Robotics Lab designed a 3D-printed glove lined with stretchable fiber-optic sensors that use light to detect a range of deformations in real time. (Image courtesy of Cornell University.)

SLIMS is a long tube containing a pair of polyurethane elastomeric cores. One core is transparent, while the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled to a sensor chip that registers geometric changes in the optical path of light when deformations, such as pressure or bending, occur. The team also created a mathematical model to decouple the different deformations and determine their location and magnitude.
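The article doesn’t give the team’s actual model, but the decoupling idea can be sketched generically: if each deformation mode perturbs the sensor’s color channels in a known, calibrated way, the combined response can be inverted with least squares. The calibration matrix and readings below are placeholders, not values from the Cornell paper:

```python
# A minimal sketch of the decoupling idea, assuming a linear model in which
# press, bend and stretch each leave a distinct fingerprint on the channels.
import numpy as np

# Rows: sensor color channels; columns: candidate deformation modes.
# In practice this matrix would come from calibrating the SLIMS tube.
calibration = np.array([
    [0.8, 0.1, 0.2],   # channel 1 response to press / bend / stretch
    [0.1, 0.9, 0.3],   # channel 2
    [0.2, 0.2, 0.7],   # channel 3
])

def decouple(delta_intensity: np.ndarray) -> np.ndarray:
    """Estimate the magnitude of each deformation mode via least squares."""
    modes, *_ = np.linalg.lstsq(calibration, delta_intensity, rcond=None)
    return modes

reading = np.array([0.45, 0.25, 0.30])  # hypothetical change in channel output
print(decouple(reading))  # per-mode magnitudes: press, bend, stretch
```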

To test the wearable technology, the team lined a 3D-printed glove with SLIMS sensors and fitted it with Bluetooth, a lithium battery and basic circuitry. The glove transmits its data to software developed by Bai, which reconstructs the hand’s movements in real time.
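The article doesn’t describe the glove’s protocol, so the sketch below is purely hypothetical: it assumes each Bluetooth packet carries one bend value per finger and shows how a stream of such packets could drive a live reconstruction loop.

```python
# Hypothetical sketch of the glove-to-software pipeline described above.
# The packet layout, units, and transport are all assumptions.
import struct

FRAME_FORMAT = "<5f"                    # five floats: one bend value per finger
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)

def parse_frame(packet: bytes) -> list[float]:
    """Decode one packet into per-finger bend estimates (degrees)."""
    return list(struct.unpack(FRAME_FORMAT, packet[:FRAME_SIZE]))

def reconstruct(stream):
    """Turn a stream of packets into a live sequence of hand poses."""
    for packet in stream:
        yield parse_frame(packet)  # feed each pose to a rendering loop

# Example with one fabricated packet standing in for the radio link:
fake_packet = struct.pack(FRAME_FORMAT, 10.0, 45.0, 30.0, 5.0, 0.0)
for pose in reconstruct([fake_packet]):
    print(pose)  # [10.0, 45.0, 30.0, 5.0, 0.0]
```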

Although the technology is only in the early stages, the team hopes to eventually commercialize it for use in physical therapy and sports medicine, as well as AR/VR experiences and robotics.

“Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire,” Shepherd said. “If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”

Giving AR/VR systems a human-like sense of touch would make those experiences more immersive by letting users feel contact with virtual objects. For robotics, the potential is a game changer: the technology could give robots the ability to sense their surroundings and handle objects.

Interested in more AR/VR or robotics innovations? Check out Soft Humanoid Hands Help Robots Grasp Fragile Objects and Invisible Computing: Mojo Vision Is Developing an AR Contact Lens.