A new epidermal VR interface imparts a sense of touch through electronic signals.
Engineers at Northwestern University have created an epidermal virtual reality (VR) interface that accurately imparts a synthetic sense of touch to wearers.
For decades, those interested in virtual reality have longed for a more immersive experience when exploring their digital worlds. While sight and sound have been convincingly digitized, engineers have yet to reproduce our three other recognized senses in a way that is indistinguishable from reality. Now, a new thin, wireless technology could claim another of those senses for the realm of VR.
“We are expanding the boundaries and capabilities of virtual and augmented reality,” said Northwestern’s Yonggang Huang.
Through the use of stretchable electronics, Huang and his research partner John Rogers have created a wearable device that uses millimeter-scale actuators and wireless power transfer to stimulate nerve endings in the wearer's skin.
Using a custom graphical user interface and an off-the-shelf touchscreen, researchers can capture a gesture on the input device, process it, and reproduce it as a matching pattern of touch through their skin-like interface.
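To make the general idea concrete, here is a minimal sketch, in Python, of how a touchscreen gesture might be mapped onto a small grid of haptic actuators. The grid size, screen resolution, data format, and all function names are assumptions for illustration only; they do not describe the Northwestern team's actual software.

```python
# Illustrative sketch only: maps touchscreen samples onto a small grid of
# haptic actuators. The 4x4 layout, screen size, and print-based "send" are
# assumptions for this example, not details of the Northwestern device.
from dataclasses import dataclass

GRID_ROWS, GRID_COLS = 4, 4          # assumed actuator layout
SCREEN_W, SCREEN_H = 1080, 1920      # assumed touchscreen resolution

@dataclass
class TouchSample:
    x: float         # pixels from the left edge
    y: float         # pixels from the top edge
    pressure: float  # normalized 0.0 to 1.0

def sample_to_actuator(sample: TouchSample) -> tuple[int, int, float]:
    """Map one touch sample to the nearest actuator cell and an intensity."""
    col = min(int(sample.x / SCREEN_W * GRID_COLS), GRID_COLS - 1)
    row = min(int(sample.y / SCREEN_H * GRID_ROWS), GRID_ROWS - 1)
    intensity = max(0.0, min(sample.pressure, 1.0))
    return row, col, intensity

def send_gesture(gesture: list[TouchSample]) -> None:
    """Relay a recorded gesture, sample by sample, to the actuator array."""
    for sample in gesture:
        row, col, intensity = sample_to_actuator(sample)
        # Stand-in for a wireless transmission to the skin interface.
        print(f"actuator[{row}][{col}] -> intensity {intensity:.2f}")

if __name__ == "__main__":
    # A short swipe across the screen with varying pressure.
    swipe = [TouchSample(100, 300, 0.4), TouchSample(540, 960, 0.8),
             TouchSample(980, 1600, 0.5)]
    send_gesture(swipe)
```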
Looking forward, Huang and Rogers believe there are many applications for their interface.
“You could imagine that sensing virtual touch while on a video call with your family may become ubiquitous in the foreseeable future,” Huang said.
However, Huang's colleague Rogers noted that the technology is still in a nascent phase and that further development will enable more sophisticated applications for the interface. "We feel that it's a good starting point that will scale naturally to full-body systems and hundreds or thousands of discrete, programmable actuators."
Currently, Rogers and Huang are putting their technology through its paces with U.S. Army veteran and combat amputee Garrett Anderson, who lost his right arm in an IED explosion in Iraq.
During tests, the joint between Anderson's arm and his prosthetic was fitted with a prototype of the epidermal VR interface, which was then connected to the fingers of the prosthetic. Anderson handled several objects and was asked whether he felt any sensation corresponding to the grip he was applying to each one.
According to Anderson, he could.
Huang and Rogers believe that, with continued use, Anderson may be able to remap his sense of touch through the device and eventually form a new neural map of his body that recognizes his prosthetic, by way of the interface, as a surrogate for his lost limb.
If that development comes to pass, the boundary between virtual reality and whatever we call this world will be truly blurred, and it may well take a philosopher, not an engineer, to resolve that dilemma.