Humanizing the Robotic Hand at ASU – A Moonshot Project

Teaching robotic hands to have a human grip takes hard work and lots of data.

Dr. Veronica Santos wants robotic hands to have a soft human grip instead of a crushing robotic clamp. Santos and the Biomechatronics Department at Arizona State University discuss the project and demonstrate a hand prototype in her Solve for X talk, Researchers Create Helping Hands.

Robotic hands don’t articulate the way that human hands do, relying instead on motors and voltage for their movements. Using campus volunteers, the team captures motion data from fingers and joints to set a baseline. Fingertip force data is also logged, and occasionally muscle movement sounds are recorded as audio data.
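One way to picture the collected data is as a per-trial record bundling the three streams mentioned above. This is only a hedged sketch with hypothetical field names, not the team's actual data format:

```python
# Hypothetical per-trial record for the three data streams described above:
# motion-capture joint angles, fingertip forces, and optional muscle audio.
from dataclasses import dataclass, field
from typing import List

@dataclass
class GraspTrial:
    joint_angles_deg: List[List[float]]  # one list of joint angles per mocap frame
    fingertip_forces_n: List[float]      # logged fingertip force samples, in newtons
    muscle_audio: List[float] = field(default_factory=list)  # optional audio samples

    def peak_force(self) -> float:
        """Peak fingertip force for the trial, a simple baseline statistic."""
        return max(self.fingertip_forces_n, default=0.0)
```

A record like this makes it easy to compute baseline statistics across many volunteer trials, e.g. `GraspTrial([[10.0, 20.0]], [0.5, 1.2, 0.9]).peak_force()` returns the largest logged force.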


https://asunews.asu.edu/files/imagecache/story_main_image/images/Veronica_Santos_Robot_Arm_cropped.jpg

The data allows the team to develop an algorithm that mimics a human hand gripping an object. Instead of a binary open-close condition, the hand now has the information to produce a more human clamping motion. When users cannot fully see the object being gripped, the haptic feedback opens up a new world of possibilities.
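The contrast between a binary open-close condition and a graded grip can be sketched in a few lines. This is not the team's actual algorithm, just a minimal illustration: the binary controller is all-or-nothing, while a proportional controller scales motor effort to the gap between measured and target fingertip force:

```python
# Hedged sketch (not the ASU team's algorithm): binary vs. graded grip control.

def binary_grip(contact_detected: bool) -> float:
    """Naive controller: motor command is fully on or fully off."""
    return 1.0 if contact_detected else 0.0

def graded_grip(measured_force: float, target_force: float, gain: float = 0.5) -> float:
    """Proportional controller: motor effort scales with the force error,
    clamped to the motor's [0, 1] command range."""
    error = target_force - measured_force
    return max(0.0, min(1.0, gain * error))
```

With fingertip force feedback, the graded controller eases off as the measured force approaches the target instead of clamping at full strength, which is what permits a softer, more human grip.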

Prosthetics are the most prominent application for this technology, but Dr. Santos discusses a few others in the video, including deep-space, deep-sea, and military uses. The goal is to create a technology for use any time a human wants to control an artificial grasper to manipulate objects.

This project is more than a decade in the making for Dr. Santos, who published a paper on thumb kinematics back in 2003. In late 2013, as part of the Adventures in Engineering Lecture Series at ASU, Santos gave an excellent presentation to freshman engineering students about the project and her team's next steps.


https://asunews.asu.edu/files/imagecache/story_main_image/robotic_hand_125w_1.jpg