AI vs. Traditional Below-Knee Socket Plaster Casts—Which Design is the Best?
The development of prosthetic sockets is a meticulous, collaborative process that often requires several check sockets to achieve an optimal fit for the user. This traditional method has long been hampered by inconsistency: comfort and fit can vary with the individual prosthetist’s skill. Recent advances in artificial intelligence (AI), however, could improve comfort and make the design process more consistent.
Researchers explored the possibility of creating AI algorithms to improve the shaping of prosthetic sockets, specifically for below-knee prostheses. Their goal was to determine whether AI could reduce reliance on the individual abilities of prosthetists, thereby establishing a more uniform approach to socket creation. The results of this study were published in the Archives of Physical Medicine and Rehabilitation in September 2024.
The study
The research was conducted in two phases: first, an AI algorithm was created to predict the shapes of prosthetic sockets, and second, these AI-designed sockets were evaluated against those crafted manually. The study took place at the rehabilitation department of Radboud University Medical Center located in Nijmegen, the Netherlands.
To develop the AI algorithm, data was collected retrospectively from 116 patients associated with OIM Orthopedie, a Dutch orthopedic firm. The algorithm was then tested on a sample of 10 randomly chosen participants from Papenburg Orthopedie.
During the testing phase, participants received two variations of the prosthetic socket: one that was manually measured and designed (MMD) by a seasoned prosthetist with more than 32 years of experience, and one that was digitally measured and standard-designed (DMSD) based on the AI algorithm’s predictions. Both types of sockets were produced using the same 3D printer and materials.
The study design included three appointments at which each participant’s prosthetic sockets were fitted, aligned, and evaluated.
Following these evaluations, participants wore the sockets at home for one week in a blinded test, unaware of which socket they were using. Only the researchers and the prosthetist knew which socket was MMD and which was DMSD.
The findings showed that the prosthetic sockets predicted by the AI algorithm differed from the actual designs by an average of 2.51 mm. While two of the 10 DMSD prosthetic sockets were found unsuitable due to insufficient socket volume, the remaining eight were reported to be more comfortable than their MMD counterparts.
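To make a figure like 2.51 mm concrete, here is a minimal sketch of how an average shape deviation could be computed, assuming the predicted and reference socket shapes are available as aligned meshes with corresponding vertices expressed in millimeters. The data layout and function are illustrative assumptions, not the metric pipeline used in the study.

```python
import numpy as np

def mean_shape_deviation(predicted_vertices: np.ndarray,
                         reference_vertices: np.ndarray) -> float:
    """Mean Euclidean distance (mm) between corresponding vertices of two
    aligned socket meshes with identical topology.

    Both arrays are assumed to have shape (n_vertices, 3) and share the same
    coordinate frame. Hypothetical illustration, not the study's own metric.
    """
    if predicted_vertices.shape != reference_vertices.shape:
        raise ValueError("Meshes must have the same vertex count and layout")
    distances = np.linalg.norm(predicted_vertices - reference_vertices, axis=1)
    return float(distances.mean())

# Toy example: a reference mesh and a prediction offset by roughly 2.5 mm on average.
rng = np.random.default_rng(0)
reference = rng.uniform(-50, 50, size=(1000, 3))                  # vertex positions in mm
predicted = reference + rng.normal(0.0, 1.57, size=reference.shape)
print(f"Mean deviation: {mean_shape_deviation(predicted, reference):.2f} mm")
```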
The algorithm’s limitations
While the results showed AI's potential, there is still room for improvement. The accuracy of an AI algorithm depends on both the volume and the quality of the data it is trained on.
In this study, the algorithm was trained on a limited dataset covering a narrow range of residual limb lengths, thicknesses, and types, which may have hindered its ability to generalize across diverse residual limbs. This limitation was evident in the two DMSD prosthetic sockets deemed unsuitable because they were too small: those participants' limbs were narrower than the others' in the second phase of the study. Incorporating a wider variety of residual limb sizes and shapes in the training dataset is therefore crucial.
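One simple safeguard this limitation suggests is checking whether a new residual limb falls inside the range of measurements the model saw during training before trusting its predicted socket. The sketch below illustrates that idea; the measurement names, ranges, and thresholds are hypothetical stand-ins, not values from the published study.

```python
# Hypothetical summary of the training set: min/max of a few limb measurements (mm).
# Real systems would use richer shape descriptors; these names are illustrative only.
TRAINING_RANGES = {
    "residual_limb_length": (90.0, 220.0),
    "mid_limb_circumference": (240.0, 420.0),
    "distal_circumference": (180.0, 360.0),
}

def out_of_range_features(limb_measurements: dict[str, float]) -> list[str]:
    """Return the measurements that fall outside the training data's range.

    A non-empty result suggests the AI-predicted socket should be reviewed by
    a prosthetist rather than used directly, as with the two narrow limbs in
    the study's test group.
    """
    flagged = []
    for name, value in limb_measurements.items():
        low, high = TRAINING_RANGES[name]
        if not (low <= value <= high):
            flagged.append(name)
    return flagged

# A narrower-than-usual limb triggers a warning instead of a silent misfit.
new_limb = {"residual_limb_length": 130.0,
            "mid_limb_circumference": 225.0,   # below the training minimum
            "distal_circumference": 200.0}
print(out_of_range_features(new_limb))  # ['mid_limb_circumference']
```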
Furthermore, the quality of the dataset plays a vital role. Ideally, it should include scans of well-fitting prosthetic sockets worn by individuals who report high satisfaction. Gathering that feedback could help train the algorithm to recognize the features of an effective socket design, ultimately improving fit and performance.
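The sketch below shows one simple way satisfaction feedback could be folded into training: weighting each example by its user's satisfaction rating so the model learns more from well-received sockets. The model, features, and ratings are hypothetical stand-ins under that assumption, not the study's actual method.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical training data: limb-scan features and the shape adjustment (mm)
# a prosthetist applied at one point on the socket, plus a 1-5 satisfaction rating.
rng = np.random.default_rng(1)
limb_features = rng.normal(size=(116, 8))           # e.g., lengths, circumferences
rectification_mm = rng.normal(scale=3.0, size=116)  # target shape adjustment
satisfaction = rng.integers(1, 6, size=116)         # 1-5 user rating per socket

# Weight each example by its satisfaction score so highly rated sockets
# influence the fitted model more than poorly rated ones.
model = Ridge(alpha=1.0)
model.fit(limb_features, rectification_mm, sample_weight=satisfaction)

new_limb = rng.normal(size=(1, 8))
print(f"Predicted rectification: {model.predict(new_limb)[0]:.2f} mm")
```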
The bottom line
The findings of this research highlight the promise of AI algorithms in standardizing the design process for below-knee prosthetic sockets. By reducing reliance on the skill level of prosthetists, this approach could result in a quicker manufacturing process and provide a more uniform fit for users of below-knee prostheses.
However, additional research and improvements are necessary to optimize the performance of the AI algorithms.
What do you think of this development?