Get a Grip: Artificial Intelligence Improves Prosthetic Hand Control

Researchers use AI to make hand prostheses more intuitive via the synergy principle.

The advancements made in prosthetics in recent years are nothing short of astounding. Not so long ago, needing a prosthesis meant accepting significant reductions in mobility or dexterity. These days, artificial extremities are made from lighter, more durable materials and often incorporate sensors, microprocessors and other electronics to enable more natural movements. Of course, there’s still more work to be done, as recently demonstrated by Cristina Piazza and her research team at the Technical University of Munich (TUM).

Piazza, a professor of rehabilitation and assistive robotics, and her coauthors have published a paper in which they demonstrate how artificial intelligence (AI) can make hand prostheses more intuitive by combining 128 sensors on the forearm with a concept from neuroscience known as the synergy principle.

“It is known from neuroscientific studies that repetitive patterns are observed in experimental sessions, both in kinematics and muscle activation,” explained Piazza in a press release. Essentially, the synergy principle holds that the muscles of the forearm are recruited in coordinated, repeatable patterns whenever we move our hands, rather than being activated independently. “When we use our hands to grasp an object, for example a ball, we move our fingers in a synchronized way and adapt to the shape of the object when contact occurs,” said Piazza.
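For readers who want a concrete picture, muscle synergies are commonly extracted in the research literature by factoring multichannel muscle recordings into a small set of shared activation patterns, often with non-negative matrix factorization (NMF). The sketch below is purely illustrative: the synthetic data, channel count and synergy number are assumptions, not details from the TUM study.

```python
# Minimal sketch of muscle-synergy extraction via non-negative matrix
# factorization (NMF), a standard technique in the synergy literature.
# The EMG data here is synthetic; the channel and synergy counts are
# illustrative assumptions, not values from the TUM study.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_channels, n_samples, n_synergies = 8, 1000, 3

# Synthetic rectified EMG: a few shared activation patterns (synergies)
# mixed across channels, plus a little noise.
true_weights = rng.random((n_channels, n_synergies))      # per-muscle weights
true_activations = rng.random((n_synergies, n_samples))   # time courses
emg = true_weights @ true_activations + 0.05 * rng.random((n_channels, n_samples))

# Factor EMG ≈ W @ H: W holds each muscle's synergy weights,
# H holds each synergy's activation over time.
model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(emg)   # shape: (channels, synergies)
H = model.components_          # shape: (synergies, time)

print("reconstruction error:", model.reconstruction_err_)
```

The appeal of this kind of decomposition is dimensionality: instead of tracking dozens of muscles individually, a controller only needs to track a handful of synergy activations.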

By creating new machine learning algorithms that can interpret data from the sensors on a patient’s forearm, Piazza and her colleagues have made controlling prosthetic hands much more intuitive. “With the help of machine learning, we can understand the variations among subjects and improve the control adaptability over time and the learning process,” added Patricia Capsi Morales, the senior scientist on Piazza’s team.

The 128 forearm sensors are divided between two films on either side of the limb. “The more sensors we use, the better we can record information from different muscle groups and find out which muscle activations are responsible for which hand movements,” Piazza explained.
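In broad strokes, decoding pipelines like this reduce each short window of multichannel sensor data to a feature vector and classify it into an intended movement. The sketch below assumes a simple per-channel RMS feature and a logistic-regression decoder running on synthetic signals; only the 128-channel count comes from the article, and everything else is a stand-in, not the team's actual method.

```python
# Hedged sketch of intent decoding from multichannel forearm signals:
# windowed RMS features feed a classifier that predicts a grasp type.
# The 128 channels match the article; the window size, grasp labels and
# logistic-regression decoder are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_channels, window = 128, 200            # 200-sample window per decision
grasps = ["rest", "power", "pinch"]      # hypothetical grasp classes

def rms_features(window_data):
    """Root-mean-square per channel: a common, simple EMG feature."""
    return np.sqrt(np.mean(window_data ** 2, axis=-1))

# Synthetic dataset: each grasp biases a different subset of channels,
# mimicking distinct muscle groups driving distinct movements.
X, y = [], []
for label, grasp in enumerate(grasps):
    for _ in range(100):
        data = rng.normal(0.0, 1.0, (n_channels, window))
        data[label * 40:(label + 1) * 40] *= 3.0   # stronger activity here
        X.append(rms_features(data))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", decoder.score(X_test, y_test))
```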

Unfortunately, there are still some significant obstacles to overcome before AI-powered prosthetics can become practical. For one, the learning algorithm needs to be retrained whenever the sensor films slip or are removed. In practice, that means each time a user puts the sensors on, the algorithm must first relearn the activation patterns for each movement sequence before it can detect the user’s intention and translate it into commands for the artificial hand.
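That calibration workflow might look something like the sketch below: each time the films are applied, the wearer repeats every movement a few times and a fresh decoder is fit on those recordings. The record_trial callback, trial count and classifier here are all hypothetical stand-ins, not the team's implementation.

```python
# Minimal sketch of per-session calibration: collect a few labeled trials
# of each movement after the sensor films are applied, then refit the
# decoder from scratch. record_trial, the trial count and the classifier
# are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

def rms_features(trial):
    """Per-channel root-mean-square of one recorded trial (channels x time)."""
    return np.sqrt(np.mean(trial ** 2, axis=-1))

def calibrate(record_trial, movements, trials_per_movement=5):
    """Prompt the user through each movement, then fit a fresh decoder."""
    X, y = [], []
    for label, movement in enumerate(movements):
        for _ in range(trials_per_movement):
            X.append(rms_features(record_trial(movement)))  # user performs it
            y.append(label)
    decoder = LogisticRegression(max_iter=1000)
    return decoder.fit(np.array(X), np.array(y))
```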

Still, with the rapid advancements taking place in AI—especially in edge computing—biomedical engineers and prosthetics designers would be well served by keeping an eye on this line of research. It may not be long before every artificial hand comes with its very own artificial intelligence.

The team’s research was presented at the 2023 IEEE International Conference on Rehabilitation Robotics (ICORR) and published in the conference proceedings.

Written by

Ian Wright

Ian is a senior editor at engineering.com, covering additive manufacturing and 3D printing, artificial intelligence, and advanced manufacturing. Ian holds bachelor’s and master’s degrees in philosophy from McMaster University and spent six years pursuing a doctoral degree at York University before withdrawing in good standing.