Improving Neural Control of Upper-Limb Prostheses
Existing upper-extremity prostheses that rely on a human-computer interface typically offer restricted functionality and place a significant cognitive demand on their users. These limitations are often responsible for the high rates at which prosthetic devices are abandoned.
Researchers from the MIT Media Lab are taking steps to address this issue, using machine learning and human-computer interaction to enhance human physical capability and improve mental well-being.
Harnessing residual limb signals
This new project by Michael Fernandez of the K. Lisa Yang Center for Bionics, Junqing Qiao of the MIT Department of Mechanical Engineering, and Hugh Herr of the MIT Department of Media Arts and Sciences aims to transform upper-extremity prosthetics, offering more natural movement by harnessing residual limb signals. The team's goal is to reduce device abandonment, opening new possibilities for individuals with limb differences.
The researchers are investigating different methods of translating a prosthetic user's intentions into movement of the prosthetic device. They are examining signals generated by the residual-limb musculature both in recipients of the MIT-developed Agonist-antagonist Myoneural Interface (AMI) procedure and in individuals with traditional amputations. These muscle signals are then used as control signals for the prosthetic arm.
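The article does not describe the team's signal-processing pipeline, but a common approach to muscle-driven control is to rectify and smooth each electromyography (EMG) channel into an activation envelope, then map the difference between an agonist-antagonist pair to a joint velocity command. The sketch below illustrates that general idea only; the function names, window size, and gain are illustrative assumptions, not the researchers' method.

```python
import numpy as np

def emg_envelope(emg, window=50):
    # Rectify raw EMG, then smooth with a moving average
    # to estimate muscle activation (illustrative parameters).
    rectified = np.abs(emg)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def joint_velocity(agonist, antagonist, gain=1.0):
    # Differential control: net activation of the agonist-antagonist
    # pair drives the commanded joint velocity.
    return gain * (emg_envelope(agonist) - emg_envelope(antagonist))

# Simulated example: a strongly active agonist and a quiet antagonist
# should produce a positive (flexion-direction) command on average.
rng = np.random.default_rng(0)
agonist = rng.normal(0.0, 1.0, 1000)     # high-variance "active" muscle
antagonist = rng.normal(0.0, 0.1, 1000)  # low-variance "quiet" muscle
cmd = joint_velocity(agonist, antagonist)
```

A differential scheme like this is one reason paired agonist-antagonist signals, such as those preserved by the AMI procedure, are attractive for intuitive control: co-contraction cancels out, while imbalanced activation maps naturally to directed movement.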
So far, the study has found that using these biological signals for control grants users intuitive command of multiple degrees of freedom. Ultimately, advanced controllers could enable prosthesis control with native biomechanics.
As of this writing, the researchers are continuing to test their approach. The video below shows one of the subjects, Dave, operating a pair of scissors and cutting a piece of paper using a neurally controlled prosthetic arm.