
Researchers in UNC/NC State BME and NC State Electrical and Computer Engineering have developed new software that can be integrated with existing hardware to enable people using robotic prosthetics or exoskeletons to walk more safely and naturally on different types of terrain. The new framework includes robust artificial intelligence (AI) algorithms that allow the software to better account for uncertainty.

Recently published in IEEE Transactions on Automation Science and Engineering, the paper, “Environmental Context Prediction for Lower Limb Prostheses with Uncertainty Quantification,” is co-authored by BME Jackson Family Distinguished Professor Helen Huang and BME Ph.D. student Minhan Li, along with ECE Associate Professor Edgar Lobaton, recent ECE Ph.D. graduate Boxuan Zhong, and ECE Ph.D. student Rafael da Silva.

In their study, the researchers used cameras worn on eyeglasses and cameras mounted on the lower-limb prosthesis itself. They evaluated how well the AI could make use of computer vision data from each type of camera, both separately and in combination. “Incorporating computer vision into control software for wearable robotics is an exciting new area of research,” says Huang. “We found that using both cameras worked well, but required a great deal of computing power and may be cost prohibitive. However, we also found that using only the camera mounted on the lower limb worked pretty well, particularly for near-term predictions, such as what the terrain would be like for the next step or two.”

To train the AI system, the researchers had able-bodied individuals wear the cameras while walking through a variety of indoor and outdoor environments. They then had a person with lower-limb amputation wear the cameras while traversing the same environments. “We found that the model can be appropriately transferred so the system can operate with subjects from different populations,” Lobaton says. “That means that the AI worked well even though it was trained by one group of people and used by somebody different.” The next step is to test the new framework in a robotic prosthetic device. To read the full story, visit the WRAL TechWire website.
