Abstract
Objectives: Robotic prosthetic limbs promise to replace the mechanical function of lost biological extremities and restore amputees' capacity to move and interact with the environment. Despite recent advances in biocompatible electrodes, surgical procedures, and mechatronics, the impact of current solutions is hampered by the lack of intuitive and robust man-machine interfaces.
Approach: Building on the authors' previous developments, this work presents a biomimetic interface that synthesizes the musculoskeletal function of an individual's phantom limb as controlled by neural surrogates, i.e., electromyography-derived neural activations. In contrast to current approaches based on machine learning, our method employs explicit representations of the musculoskeletal system to reduce the space of feasible solutions when translating electromyograms into prosthesis control commands. Electromyograms are mapped onto mechanical forces that belong to a subspace contained within the broader operational space of an individual's musculoskeletal system.
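To make the described pipeline concrete, the sketch below illustrates one possible EMG-driven musculoskeletal mapping of this kind: EMG envelopes are shaped into neural activations, passed through simplified Hill-type muscle models, and summed into a joint torque that drives a prosthesis velocity command. All curve shapes, parameter values, and the helper names (`emg_to_activation`, `hill_type_force`, `joint_torque`) are illustrative assumptions, not the calibrated model used in the study.

```python
import numpy as np

# Minimal, illustrative EMG-driven musculoskeletal pipeline (not the authors' code).
# Parameters and curve shapes below are placeholders.

def emg_to_activation(emg_envelope, shape_factor=-2.0):
    """Nonlinear EMG-to-activation shaping: maps u in [0, 1] to a in [0, 1]."""
    u = np.clip(emg_envelope, 0.0, 1.0)
    return (np.exp(shape_factor * u) - 1.0) / (np.exp(shape_factor) - 1.0)

def hill_type_force(a, l_norm, f_max=300.0):
    """Simplified Hill-type muscle force: bell-shaped active force-length curve
    plus an exponential passive element (force-velocity omitted for brevity)."""
    f_active = np.exp(-((l_norm - 1.0) ** 2) / 0.45)          # active force-length curve
    f_passive = np.exp(10.0 * (l_norm - 1.0)) / np.exp(5.0)   # passive stretch force
    return f_max * (a * f_active + f_passive)

def joint_torque(activations, l_norms, moment_arms):
    """Sum muscle forces times moment arms to obtain the net joint torque (Nm)."""
    forces = np.array([hill_type_force(a, l) for a, l in zip(activations, l_norms)])
    return float(np.dot(moment_arms, forces))

# Example: two antagonistic muscles acting on one prosthetic degree of freedom.
emg = np.array([0.6, 0.1])                       # normalized EMG envelopes (flexor, extensor)
activations = emg_to_activation(emg)
tau = joint_torque(activations, l_norms=[1.0, 1.05], moment_arms=[0.04, -0.03])

# Admittance-style mapping from model torque to a prosthesis velocity command.
velocity_command = 2.0 * tau                     # rad/s per Nm, illustrative gain
print(f"net torque: {tau:.2f} Nm -> velocity command: {velocity_command:.2f} rad/s")
```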
Results: Our results show that this constraint makes the approach applicable to real-world scenarios and robust to movement artefacts. This stems from the fact that any control command must always lie within the operational space of the musculoskeletal model and is therefore physiologically plausible. The approach was effective both in intact-limbed individuals and in a transradial amputee, who displayed robust online control of multi-functional prostheses across a large repertoire of challenging tasks.
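One hedged way to state this constraint in equations (our notation, not taken from the paper): because the decoded torque is a weighted sum of modelled muscle forces driven by activations bounded in [0, 1], the command can never leave the torque range the modelled musculature can actually produce.

```latex
% Illustrative statement of the feasibility constraint (notation is ours).
\[
  \tau(t) = \sum_{i=1}^{N} r_i(q)\, F_i\!\bigl(a_i(t),\, l_i(q),\, v_i(q,\dot q)\bigr),
  \qquad a_i(t) \in [0, 1]
\]
\[
  \Rightarrow \quad \tau(t) \in \bigl[\tau_{\min}(q,\dot q),\; \tau_{\max}(q,\dot q)\bigr]
\]
```

Here $\tau_{\min}$ and $\tau_{\max}$ denote the extreme torques the modelled muscles can generate at the current posture and velocity.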
Significance: The development and translation of man-machine interfaces that account for an individual's neuromusculoskeletal system create unprecedented opportunities to understand how disrupted neuro-mechanical processes can be restored or replaced via biomimetic wearable assistive technologies.
| Original language | English |
| --- | --- |
| Article number | 066026 |
| Journal | Journal of Neural Engineering |
| Volume | 15 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 22 Oct 2018 |
Keywords
- Prosthesis
- EMG
- Neuromusculoskeletal modeling
- Neuromuscular control
- Amputees