
Harvard engineers have pioneered a unique wearable robot that adapts to user movements, providing personalised assistance to enhance daily activities for stroke and ALS patients
Harvard researchers have developed an innovative wearable robot designed to assist individuals with upper-limb impairments, including stroke and ALS patients. Using machine learning, the device personalises movement support, helping users perform everyday activities such as eating, drinking, and reaching, while providing a new approach to tailored rehabilitation.
The research is published in Nature Communications.
Years of research have led to smarter, gentler robotic assistance
For the last several years, Harvard bioengineers have been developing a soft, wearable robot that provides movement assistance and augments therapy to help people with impairments regain mobility. Physical motions, however, are highly individualised, especially for those with mobility impairments. The engineers found that advances in machine learning could give the device that personalised touch.
Researchers in the John A. Paulson School of Engineering and Applied Sciences (SEAS), in collaboration with physician-scientists at Massachusetts General Hospital and Harvard Medical School, have enhanced their wearable robot to respond precisely to an individual user’s movements.
The study was led by Conor Walsh, whose lab develops human-centered assistive robots for individuals with movement impairments. Over the course of six years, Walsh collaborated with stroke expert Dr. David Lin and ALS specialist Dr. Sabrina Paganoni at Massachusetts General Hospital to develop clinically relevant devices. Paganoni emphasised the importance of incorporating both clinician and patient perspectives from the earliest prototypes.
Machine learning personalises assistance
The researchers describe a significant update to the software powering the device, which consists of a sensor-loaded vest with a balloon attached underneath the arm that inflates and deflates to apply mechanical assistance to a weak or impaired limb. The team used a machine learning model that personalises assistance levels for each user by learning which movements the user is attempting to perform, via sensors tracking both motion and pressure.
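The paper does not publish the model's internals, but intent detection from combined motion and pressure signals can be sketched as a supervised classifier over short sensor windows. A minimal sketch, assuming a logistic-regression classifier and an invented three-feature window layout; none of these names or choices come from the study itself:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative only: synthetic sensor windows standing in for the vest's
# motion and pressure streams. Each row is one short window; the three
# columns are assumed summary features: [mean arm angle, angle variance,
# mean balloon pressure].
rng = np.random.default_rng(0)

def make_windows(n, lift):
    # Hypothetical feature means for "raising" vs "lowering/resting" windows.
    base = np.array([60.0, 4.0, 12.0]) if lift else np.array([20.0, 2.0, 6.0])
    return base + rng.normal(scale=1.0, size=(n, 3))

X = np.vstack([make_windows(200, lift=True), make_windows(200, lift=False)])
y = np.array([1] * 200 + [0] * 200)  # 1 = raising the arm, 0 = lowering/resting

# Train on one user's own movement data, as the personalised model does.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Classify a new window of sensor data to infer the attempted movement.
window = np.array([[58.0, 3.5, 11.0]])
print("intended movement:", "raise" if clf.predict(window)[0] == 1 else "lower")
```

The key point the sketch illustrates is that the model is trained per user, so the decision boundary reflects that individual's movement patterns rather than a population average.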
In previous versions of the device, which only tracked motion, the researchers found that users had trouble lowering their arm once the robot had helped lift it. “Some people didn’t have enough residual strength to overcome any kind of mistake the robot was making,” explained co-first author and graduate student James Arnold.
The new wearable robot incorporates a physics-based model that they had previously developed to estimate the minimum pressure needed to support the arm during movement. This makes the robot’s assistance feel natural to the user, providing help with tasks such as eating and drinking. By combining the physics model with machine learning, the wearable robot can quickly adjust the level of assistance it provides at any time, based on what it has learned about how the user typically moves.
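The combination described above can be sketched as a static torque balance that yields a minimum support pressure, scaled by a learned intent signal. Every constant and function name below is an illustrative placeholder, not the authors' calibrated model:

```python
import math

# Hedged sketch of a simplified physics model: the minimum balloon pressure
# needed to hold the arm against gravity at a given elevation. All constants
# are assumed values for illustration.
ARM_MASS_KG = 2.0          # approximate mass of the supported arm segment
ARM_COM_M = 0.25           # shoulder-to-centre-of-mass distance
G = 9.81                   # gravitational acceleration, m/s^2
EFFECTIVE_AREA_M2 = 0.01   # balloon contact area acting on the arm (assumed)
MOMENT_ARM_M = 0.10        # lever arm of the balloon force about the shoulder

def min_support_pressure(elevation_deg):
    """Pressure (Pa) balancing the gravity torque at a given arm elevation."""
    torque = ARM_MASS_KG * G * ARM_COM_M * math.cos(math.radians(elevation_deg))
    force = torque / MOMENT_ARM_M
    return max(force / EFFECTIVE_AREA_M2, 0.0)

def commanded_pressure(elevation_deg, intent_gain):
    """Scale the physics baseline by a learned intent signal in [0, 1]:
    near 1 when the model detects a raise, lower when the user is trying
    to bring the arm down, so the robot does not fight the movement."""
    return intent_gain * min_support_pressure(elevation_deg)

print(commanded_pressure(30.0, 1.0))  # full support mid-raise
print(commanded_pressure(30.0, 0.3))  # reduced support while lowering
```

This separation is the design idea the article describes: the physics model keeps assistance near the minimum needed so it feels natural, while the learned intent gain lets the level adjust quickly to what the user is trying to do.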
“For people living with ALS, the most important considerations include comfort, ease of use, and the ability of the device to adapt to their specific needs and movement patterns,” Paganoni said. “Personalization is crucial to enhance their functional independence and quality of life … This device holds the potential to significantly improve upper limb function, enhance daily living activities, and reduce compensatory movements.”
Wearable robot reduces effort and enhances daily activity performance
The researchers found that a wearable robot trained on an individual’s movement data could distinguish the user’s shoulder movements with 94% accuracy. The amount of force required to lower a person’s arm was reduced by about a third compared to previous versions. The users also exhibited larger ranges of motion in their shoulders, elbows, and wrists, thereby reducing the need to compensate with body leaning or twisting, and making their movements overall more precise and efficient.
Past studies with the wearable robot had focused on a single joint or a single clinical score for evaluating patient movement, explained co-first author and postdoctoral fellow Prabhat Pathak. “What we did here was look at simulated activities of daily living, using a highly accurate motion capture system — similar to systems used in movies. We looked at how every joint movement changed, and if they were able to do the tasks more efficiently.”