Hard Bionics

Smart artificial hand combines user and robotic control for assistive solution

Image: EPFL

Scientists at Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have developed a smart robotic hand to help amputees in daily tasks. The artificial hand combines individual finger control and automation for improved grasping and manipulation.

This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven able-bodied subjects. By implementing these two concepts together, the technology contributes to the emerging field of shared control in neuroprosthetics. The results were published in Nature Machine Intelligence, EPFL reports.

The robotic hand is intelligent enough to decipher the user’s intentions, and it can grasp an object and maintain contact with it for a robust hold. Such automation may make the system more dexterous, more intuitive, and less cumbersome than previous robotic prostheses.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
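The slip-recovery behavior described above can be sketched as a simple control loop: watch the fingertip pressure readings, flag a sudden drop as a slip, and tighten the grip before the object falls. This is a minimal illustration only; the function names, thresholds, and normalized units below are assumptions, not the EPFL controller.

```python
def detect_slip(pressures, drop_threshold=0.2):
    """Flag a slip when fingertip pressure falls sharply between samples.

    `pressures` is a list of consecutive normalized readings
    (0.0 = no contact, 1.0 = firm grip). The threshold is illustrative.
    """
    for prev, curr in zip(pressures, pressures[1:]):
        if prev - curr > drop_threshold:
            return True
    return False

def stabilize_grip(grip_force, increment=0.1, max_force=1.0):
    """Tighten the grip by a small, bounded increment."""
    return min(grip_force + increment, max_force)

# Simulated loop iteration: the object starts to slip, the hand reacts.
grip = 0.5
readings = [0.80, 0.78, 0.50]  # sudden pressure drop between samples
if detect_slip(readings):
    grip = stabilize_grip(grip)
```

In a real prosthesis this loop would run continuously at a high sample rate, which is what allows a reaction well inside the 400-millisecond window mentioned above.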


The machine learning algorithm, developed by the researchers, first learns how to decode user intention and translate it into movement of the prosthetic hand’s fingers. To train the algorithm, the amputee must perform a series of hand movements. Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity. Once the user’s intended finger movements are understood, this information can be used to control the individual fingers of the prosthetic hand.
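The training step above can be sketched as a toy classifier: average the muscle-activity feature vectors recorded for each movement, then label a new reading by its nearest average. This is a deliberately simplified stand-in, the labels, two-electrode features, and nearest-centroid rule are assumptions for illustration, not the decoder from the paper.

```python
import math

def train_centroids(samples):
    """Average the EMG feature vectors recorded for each movement label.

    `samples` maps a movement name to a list of feature vectors
    (e.g. mean rectified amplitude per electrode).
    """
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        centroids[label] = [sum(col) / n for col in zip(*vectors)]
    return centroids

def decode_intent(centroids, features):
    """Return the movement whose centroid is nearest the new reading."""
    return min(centroids, key=lambda label: math.dist(centroids[label], features))

# Toy training session: two electrodes, two intended movements.
training = {
    "close_index": [[0.9, 0.1], [0.8, 0.2]],
    "close_thumb": [[0.1, 0.9], [0.2, 0.8]],
}
model = train_centroids(training)
decode_intent(model, [0.85, 0.15])  # -> "close_index"
```

Each decoded label would then map to a command for one finger of the prosthesis, which is what enables individual finger control.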

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says Katie Zhuang, first author of the publication.
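One common way to extract meaningful activity from a noisy muscle signal is to rectify it and smooth it into an amplitude envelope. The sketch below shows that classic preprocessing step; the moving-average window and the signal values are illustrative assumptions, not the learned extraction the paper describes.

```python
def emg_envelope(signal, window=3):
    """Rectify a raw EMG trace and smooth it with a moving average.

    Returns an amplitude envelope of the same length as the input;
    the window size here is arbitrary, chosen for illustration.
    """
    rectified = [abs(x) for x in signal]
    envelope = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        chunk = rectified[start:i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

# A raw trace oscillating around zero becomes a smooth activity level.
noisy = [0.1, -0.2, 0.9, -0.8, 1.0, -0.1]
smoothed = emg_envelope(noisy)
```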

Next, the scientists engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object. The algorithm tells the prosthetic hand to close its fingers when an object is in contact with sensors on the surface of the prosthetic hand.
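The hand-off between user and machine can be sketched as a simple rule: the user drives each finger until any surface sensor reports contact, at which point automation closes the hand. The real system uses a learned controller rather than this hard switch; the function and finger names below are assumptions for illustration.

```python
def shared_control(user_command, contact_detected, fingers):
    """Blend user intent with automation.

    Before contact, each finger follows the user's decoded command
    (defaulting to "hold"). Once a surface sensor detects contact,
    automation takes over and closes every finger for a robust grasp.
    """
    if contact_detected:
        return {finger: "close" for finger in fingers}
    return {finger: user_command.get(finger, "hold") for finger in fingers}

fingers = ["thumb", "index", "middle"]
# Before contact: the user controls fingers individually.
before = shared_control({"index": "flex"}, False, fingers)
# On contact: the hand closes automatically.
after = shared_control({"index": "flex"}, True, fingers)
```

The switch condition is the crux of shared control: the user supplies the high-level intent, while the machine handles the fast, low-level grasp stabilization.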

Source: www.wearable-technologies.com
