
Smart Artificial Hand Merges User And Robotic Control

This approach improves control of robotic hands by combining individual finger control with automation, yielding better grasping and manipulation. As published in Nature Machine Intelligence, the interdisciplinary proof of concept was successfully tested in three amputees and seven healthy subjects, and the researchers hope it will contribute to the emerging field of shared control for neuroprosthetics.

The approach uses a machine learning algorithm to decipher intended finger movement from muscular activity on the amputee’s stump, giving the user individual finger control of the prosthetic hand, while the robotic hand itself helps take hold of objects and maintains contact with them for more robust grasping.
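
As a rough illustration only, here is a minimal Python sketch of one way such shared control could be arbitrated. The blending rule, function names, and per-finger representation are assumptions made for the example, not the method published in the study.

```python
import numpy as np

def shared_control(user_cmd, robot_cmd, contact):
    """Blend decoded user intent with automatic grasp assistance.

    user_cmd  : per-finger flexion targets decoded from muscle activity (0..1)
    robot_cmd : per-finger commands proposed by the automatic grasp controller
    contact   : per-finger contact flags from the pressure sensors
    """
    user_cmd = np.asarray(user_cmd, dtype=float)
    robot_cmd = np.asarray(robot_cmd, dtype=float)
    contact = np.asarray(contact, dtype=bool)
    # Hypothetical arbitration rule: fingers touching the object defer to
    # the automatic controller; free fingers follow the user's decoded intent.
    return np.where(contact, robot_cmd, user_cmd)

# Example: the two fingers in contact receive robotic stabilization commands.
print(shared_control(user_cmd=[0.2, 0.5, 0.5, 0.1, 0.0],
                     robot_cmd=[0.2, 0.7, 0.7, 0.1, 0.0],
                     contact=[False, True, True, False, False]))
```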

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
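
To make the slip response concrete, here is a minimal sketch of a threshold-based slip detector and grip correction, assuming per-finger normal-force readings. The article quotes the 400-millisecond reaction budget but does not spell out the detection rule or control law, so both are assumptions here.

```python
import numpy as np

SLIP_DROP = 0.15  # assumed fractional force drop that flags a slip

def detect_slip(force_prev, force_now):
    """Flag fingers whose pressure reading drops sharply between samples.

    A fast drop in measured normal force is a common slip cue; this simple
    threshold test is illustrative only, and would have to run well inside
    the 400 ms reaction window quoted in the article.
    """
    force_prev = np.asarray(force_prev, dtype=float)
    force_now = np.asarray(force_now, dtype=float)
    drop = (force_prev - force_now) / np.maximum(force_prev, 1e-6)
    return drop > SLIP_DROP

def stabilize(grip_force, slipping, gain=1.2):
    """Tighten the grip on slipping fingers to keep hold of the object."""
    grip_force = np.asarray(grip_force, dtype=float)
    return np.where(np.asarray(slipping), gain * grip_force, grip_force)

# The second finger's force drops 30% between samples, so only it tightens.
slip = detect_slip([1.0, 1.0, 1.0], [0.95, 0.70, 0.98])
print(slip, stabilize([2.0, 2.0, 2.0], slip))
```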

For the shared control to work effectively, the algorithm must learn to decode user intention and translate it into finger movement of the prosthetic hand. This is accomplished by having the amputee perform a series of hand movements to train the algorithm. Sensors placed on the amputee’s stump detect muscular activity, enabling the algorithm to learn which hand movements correspond to which patterns of muscular activity; once the intended finger movements are decoded, that information is used to control the individual fingers of the prosthetic hand.

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says first author Katie Zhuang.
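
The calibration described above can be pictured as feature extraction followed by a classifier. The sketch below uses synthetic data, two standard time-domain EMG features, and a linear discriminant classifier as stand-ins; the study’s actual decoder and feature set are not specified in the article.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emg_features(window):
    """Two classic time-domain EMG features per channel: mean absolute
    value and waveform length, which summarize noisy raw signals."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

# Synthetic calibration data standing in for recorded stump activity:
# 200-sample windows from 8 electrodes, labeled by the movement performed.
rng = np.random.default_rng(0)
n_windows, win_len, n_ch = 300, 200, 8
labels = rng.integers(0, 3, size=n_windows)  # three movement classes
X = np.stack([emg_features(rng.normal(scale=1.0 + lbl, size=(win_len, n_ch)))
              for lbl in labels])

clf = LinearDiscriminantAnalysis().fit(X, labels)

# At run time, each incoming window is mapped to an intended movement,
# which then drives the corresponding fingers of the prosthetic hand.
print(clf.predict(X[:5]), labels[:5])
```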

When the user tries to grasp an object, the algorithm hands over to robotic automation: it tells the prosthetic hand to close its fingers once the object makes contact with sensors on the surface of the prosthetic. This automatic grasping builds on a previous study in which the hand was designed to detect the shape of objects and grasp them using tactile information alone, without visual signals.
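
A minimal sketch of such contact-triggered closing, assuming per-finger position commands normalized to [0, 1] and boolean contact flags from the surface sensors; the tactile shape detection of the earlier study is not reproduced here.

```python
import numpy as np

def auto_close(finger_pos, contact, step=0.05):
    """Advance each finger toward closure until its surface sensor reports
    contact, then hold: a simplified stand-in for the tactile grasping
    controller the article references."""
    finger_pos = np.asarray(finger_pos, dtype=float)
    contact = np.asarray(contact, dtype=bool)
    # Fingers without contact keep flexing (clipped at fully closed, 1.0);
    # fingers already touching the object hold their position.
    return np.where(contact, finger_pos, np.minimum(finger_pos + step, 1.0))

# One control tick: fingers 2 and 3 touch the object and stop; the rest close.
print(auto_close([0.40, 0.55, 0.55, 0.30, 0.20],
                 [False, True, True, False, False]))
```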

This advancement is still under development: challenges in the algorithm remain to be addressed, and further testing is needed before it can be implemented in a commercially available prosthetic hand.

“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering and Professor of Bioelectronics at Scuola Superiore Sant’Anna.
