Veterans who wear prosthetic lower legs could experience healthier and more comfortable lifestyles following novel artificial intelligence research at New Jersey Institute of Technology, in conjunction with the Veterans Affairs New York Harbor Healthcare System and Rutgers New Jersey Medical School.
Approximately 75% of patients with lower limb loss experience skin problems such as blisters, calluses, irritation, ulcers or wounds where the remaining limb meets the prosthetic, said VA biomechanical engineer Jason Maikos. These issues persist despite existing best practices in designing the liners that help hold the prosthetic in place and cushion the limb.
“No socket fits 100% perfectly on somebody’s limb. It’s impossible to do that. So there is movement of that residual limb inside the prosthetic socket. Some of that movement is up and down, some of it is rotating. What happens is, if there’s too much excessive movement on the residual limb, on the skin specifically, you can get skin abrasions, you can get skin wounds, and for people who have complications of diabetes, this can be very problematic,” he said.
“If you have an open wound, and you have trouble healing quickly, you can't wear the prosthetic device. So now you no longer can walk without crutches, which is not great, or you're now ambulating in a wheelchair, or you're not ambulating at all.”
Maikos studies these conditions using a tool called dynamic stereo x-ray, or DSX, which captures internal images from two angles instead of one by tracking barium-infused markers placed on the patient. DSX provides insight into how a patient’s remaining bone and skin move inside the prosthetic liner. The problem isn’t how to interpret those insights; it’s how to handle the deluge of data.
That’s where NJIT comes in. Each DSX trial currently takes about a day to process. Assistant Prof. Salam Daher and her students are devising a way to analyze the images with AI. “Salam's grant comes in and it speeds up. Theoretically, a trial that would maybe take my engineer about a day to track [decreases] to roughly 15 minutes” — a major advancement toward eventually moving DSX out of the experimental phase and into real-world clinical applications, Maikos explained.
Daher explained that each camera might produce 1,000 images containing 50 markers apiece, and the stereo setup doubles that workload. “That’s a lot of human time to go and annotate, to do a tedious task,” she noted. She’s analyzing the images both with a standard computer vision program and with a custom AI package, to see whether the latter could prove more reliable once it’s sufficiently trained.
“I don't know if either of them may be good enough. Computer vision by itself may be good enough. AI by itself may be good enough. It's possible that combining these foundations may lead to a better result, but I can't tell you which one it’s going to be until we actually finish the work. But these are the approaches that we'll take,” Daher added.
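To give a flavor of the kind of "standard computer vision" step involved, here is a minimal, illustrative sketch (not the team's actual software, whose details the article does not describe) of locating bright markers in a grayscale frame by thresholding and connected-component labeling. The function name, threshold, and synthetic frame are all hypothetical:

```python
from collections import deque

def find_marker_centroids(frame, threshold=128):
    """Locate bright marker blobs in a grayscale frame (a 2D list of
    pixel intensities) via thresholding and 4-connected flood fill,
    returning each blob's (row, col) centroid."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one connected blob of bright pixels.
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # The centroid is the mean of the blob's pixel coordinates.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

# Tiny synthetic frame: two bright 2x2 "markers" on a dark background.
frame = [[0] * 8 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2),
             (3, 5), (3, 6), (4, 5), (4, 6)]:
    frame[y][x] = 255

print(find_marker_centroids(frame))  # → [(1.5, 1.5), (3.5, 5.5)]
```

Done at scale, this sort of per-frame labeling is exactly the tedious task Daher describes, which is why automating or accelerating it matters.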
There are potential obstacles. One is depth perception: it’s difficult for humans and software alike to interpret what’s happening when the barium markers, placed on both legs, are viewed in a video of a patient walking. Choosing the best AI model is another challenge, as is determining how much data is actually needed to train it. Finally, Daher and her students need to present the results in a way that healthcare professionals, not just computer programmers, can understand. To that end, they plan to build their own interface.
Daher is co-principal investigator on the grant, of which $40,000 is funded by the American Orthopaedic Foot & Ankle Society. The principal investigator, David Paglia, an assistant professor at Rutgers New Jersey Medical School, received $10,000. Paglia and his team designed experiments to establish a baseline for error in the system, and he also works with Maikos on interpreting the results and moving toward clinical adoption.