Robotic Hand Combines Amputee and Robotic Control for Assistive Solution

Researchers at École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have developed an intelligent robotic hand to assist amputees in daily tasks. The research team used existing robotic hardware but developed a machine learning approach that gives amputees finer control, letting the robotic hand better anticipate user intentions, down to individual finger movements. In an approach called “shared control,” the intelligent hand can automatically handle certain movements, such as grasping and manipulation, combining robotic and user control for an improved user experience.
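The “shared control” idea can be sketched as a simple blend of two command streams, one from the user and one from the robot. Everything below is illustrative: the function name, the confidence-based weighting, and the five-finger command vectors are assumptions for the sketch, not the study’s actual controller.

```python
# Hypothetical shared-control blend: the final finger command mixes the
# user's decoded intent with the robot's own corrective command. The robot
# takes over more as its confidence (e.g. that an object is slipping) rises.

def shared_command(user_cmd, robot_cmd, robot_confidence):
    """Blend user and robot finger commands; robot_confidence is in [0, 1]."""
    a = max(0.0, min(1.0, robot_confidence))        # clamp the blend weight
    return [(1 - a) * u + a * r for u, r in zip(user_cmd, robot_cmd)]

# User relaxes the fingers, but the robot commands a tighter grip; with
# confidence 0.75 the blend favors the robot.
print(shared_command([0.0] * 5, [1.0] * 5, 0.75))   # [0.75, 0.75, 0.75, 0.75, 0.75]
```

In a scheme like this the user retains full control during ordinary movement, because the robot’s share of the command only grows when it is confident intervention is needed.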

Researchers around the world are developing assistive technologies for amputees, but this latest development sees a robot and an amputee work together in a way that has not been tried before. Using this system, the amputee signals intended finger movements through sensors on their residual limb that measure muscular activity, and those signals are translated into individual finger control on a prosthetic hand.

However, the robot is also intelligent enough to decipher the user’s intentions and has a level of automation, whereby it can, for example, grasp an object and maintain contact with it for as long as desired. Such automation may help the system to be more dexterous and intuitive, and less clumsy, than previous robotic prostheses.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” said Aude Billard, one of the researchers involved in the study. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
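That slip reaction can be sketched as a tight sensor loop: if fingertip pressure drops sharply between readings, the grip is tightened immediately. The sampling assumption, drop threshold, gain, and function name below are all hypothetical, chosen only to illustrate reacting well inside the ~400 ms window the researchers describe.

```python
# Minimal slip-handling sketch, assuming fingertip pressure sensors polled
# roughly every millisecond. A sharp fractional drop in pressure on any
# finger is treated as the object starting to slip.

SLIP_DROP = 0.15           # fractional pressure drop that signals a slip (assumed)

def react_to_slip(pressures, grip_force, gain=1.5):
    """Return an updated grip force after checking one sensor frame.

    pressures: sequence of (previous, current) readings, one pair per finger.
    """
    for prev, cur in pressures:
        if prev > 0 and (prev - cur) / prev > SLIP_DROP:
            return grip_force * gain   # tighten within a single sensor tick
    return grip_force                  # no slip detected; hold current force

# One finger's reading drops 30% between ticks, so the grip is tightened.
force = react_to_slip([(1.0, 0.95), (1.0, 0.70)], grip_force=2.0)
print(force)  # 3.0
```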

To allow the system to work in tandem with them, amputees first train it to recognize their intentions using sensors mounted on their residual limbs that measure muscular activity as they perform a set of maneuvers, including individual finger movements. “Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” said Katie Zhuang, another researcher involved in the study.
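The calibration step described above can be sketched as fitting a decoder from EMG features to finger commands. The linear least-squares decoder, RMS features, channel counts, and synthetic data below are all illustrative assumptions; the study’s actual algorithm is more sophisticated than this.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 surface-EMG channels on the residual limb, decoded
# into 5 finger commands. N_SAMPLES windows are recorded during calibration.
N_CHANNELS, N_FINGERS, N_SAMPLES = 8, 5, 500

def rms_features(raw_window):
    """Root-mean-square amplitude per channel, a classic EMG feature."""
    return np.sqrt(np.mean(raw_window ** 2, axis=-1))

# Simulated calibration session: the user performs known finger movements
# while raw EMG windows (channels x time) are recorded, plus sensor noise.
true_map = rng.normal(size=(N_CHANNELS, N_FINGERS))
raw = rng.normal(size=(N_SAMPLES, N_CHANNELS, 200))    # raw EMG windows
X = rms_features(raw)                                  # (samples, channels)
y = X @ true_map + 0.05 * rng.normal(size=(N_SAMPLES, N_FINGERS))

# "Training": fit a linear decoder by least squares.
W, *_ = np.linalg.lstsq(X, y, rcond=None)

# At run time, each new EMG window is mapped to per-finger commands.
finger_cmd = rms_features(raw[0]) @ W
print(finger_cmd.shape)   # (5,) -- one command per finger
```

The point of the sketch is the pipeline shape: noisy raw signals are reduced to stable features, and a learned mapping turns those features into movement commands.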

The automation kicks in when the user attempts specific tasks, such as grasping an object. Sensors on the prosthetic tell the device when the user is attempting to grasp something, and the hand will automatically close around it, grasping it firmly.
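That automation can be sketched as a small closing loop: once the decoded intent is a grasp, the hand closes in increments until every finger reports firm contact. The threshold, step size, and all names below are assumptions for illustration, not the device’s actual control code.

```python
# Hypothetical grasp automation: the user's decoded intent selects the task,
# and the hand then closes on its own until every finger reports a firm hold.

FIRM_CONTACT = 0.8   # assumed per-finger pressure threshold for a firm hold

def auto_grasp(user_intent, read_contact, close_step, max_steps=50):
    """Close the hand automatically while the user intends to grasp.

    read_contact(): returns current per-finger pressures.
    close_step():   commands a small closing increment on all fingers.
    """
    if user_intent != "grasp":
        return False                 # automation only engages on a grasp
    for _ in range(max_steps):
        if all(p >= FIRM_CONTACT for p in read_contact()):
            return True              # firm grasp achieved; hold position
        close_step()
    return False                     # object could not be secured

# Toy simulation: pressures rise as the fingers close around an object.
pressures = [0.0] * 5
held = auto_grasp(
    "grasp",
    read_contact=lambda: pressures,
    close_step=lambda: pressures.__setitem__(
        slice(None), [p + 0.1 for p in pressures]),
)
print(held)  # True
```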

“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” said Silvestro Micera, a third researcher involved in the study.

See a video demonstrating the system in action.

Study in Nature Machine Intelligence: Shared human–robot proportional control of a dexterous myoelectric prosthesis

Via: EPFL