Machine Learning Powered Wearable Soft Robot for Patients with Limited Hand Mobility

Researchers in South Korea have developed a wearable soft robotic device that helps patients with impaired hand mobility grasp and release objects. The team devised a machine-learning algorithm to predict user intentions, which lets patients use the device more intuitively.

Using input from a camera mounted on the user’s glasses, the machine-learning algorithm predicts what the person is attempting to do and instructs the soft robotic device to assist appropriately. For instance, if a user is reaching to grasp something, the software can detect this from the camera feed by assessing the arm’s movement and the distance to the object.

Once the software has determined that the user wishes to grasp the object, it activates soft actuators that apply an appropriate amount of assistive force to the user’s fingers. The system also includes a computer that runs the machine-learning algorithm and an actuation module that drives the hand robot. The current system is a prototype, and the researchers hope to miniaturize it so that patients can carry it easily.
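To make the pipeline concrete, here is a minimal, purely illustrative sketch of the decision flow described above — camera-derived cues in, intent out, assistive force command out. This is not the authors’ model: the function names, thresholds, and rule-based logic are hypothetical stand-ins for the learned vision-based classifier used in the actual study.

```python
def predict_intention(distance_cm: float, approaching: bool,
                      object_in_view: bool, holding: bool) -> str:
    """Classify the user's intent as 'grasp', 'release', or 'rest'.

    A real system would infer these cues (hand-object distance, arm
    motion) from the egocentric camera feed with a trained network;
    here they are passed in directly to keep the sketch self-contained.
    """
    if not object_in_view:
        # Known limitation: with the object occluded or outside the
        # camera's field of view, the device cannot decide to assist.
        return "rest"
    if not holding and approaching and distance_cm < 5.0:
        return "grasp"    # hand is closing in on the object
    if holding and not approaching:
        return "release"  # user appears done with the object
    return "rest"

def actuator_command(intent: str) -> float:
    """Map an intent to a fraction of maximum assistive finger force.

    A real controller would also maintain grip force between discrete
    intents; this mapping only shows the grasp/release endpoints.
    """
    return {"grasp": 1.0, "release": 0.0, "rest": 0.0}[intent]

if __name__ == "__main__":
    intent = predict_intention(distance_cm=3.0, approaching=True,
                               object_in_view=True, holding=False)
    print(intent, actuator_command(intent))  # grasp 1.0
```

The split between `predict_intention` and `actuator_command` mirrors the article’s description of a perception step (the algorithm on the computer) feeding a separate actuation module.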

The technology is conceived for use by patients with impaired hand mobility, such as those affected by spinal cord injury, stroke, or cerebral palsy. Losing hand mobility can make day-to-day tasks difficult or impossible, and developing assistive technologies could significantly enhance such patients’ quality of life.

At present, the technology has some limitations. For example, if the object is obscured from the camera or outside its field of view, the device cannot assist the user. “The algorithm needs to be improved by incorporating other sensor information or other existing intention detection methods, such as using an electromyography sensor or tracking eye gaze,” said Professor Kyu-Jin Cho, a researcher involved in the study.

So far, the system has allowed a patient with a spinal cord injury to perform a variety of daily tasks that would previously not have been possible, such as grasping a cup of coffee and drinking from it. The system does not need to be calibrated by a technician for each user and can adapt to each individual’s movements.

Here’s a video from Seoul National University’s Soft Robotics Research Center about the latest development:

Study in Science Robotics: Eyes are faster than hands: A soft wearable robot learns user intention from the egocentric view…

Via: Seoul National University…