Abstract
Powered leg prostheses have been introduced to enhance mobility and reduce energy consumption during walking; however, they often fail to meet user expectations. While one might assume that robotic systems should assist users, the technology is not yet advanced enough to understand and respond to user needs in a natural way, as a biological leg would. As a consequence, patients adopt unnatural compensatory movements that require higher energy expenditure and lead to problems such as cumulative trauma disorders, instability, and inefficient walking, all of which affect their quality of life.
Furthermore, existing solutions are either limited in the movements they can perform or too complex to tune (many parameters), hindering a broader range of movement.
State-of-the-art prosthetic devices are controlled using a simple set of rules to activate predefined changes in the damping and/or actuation of the knee and ankle joints, whereas the activation of biological joints is based on continuous modulation of impedance and active forces.
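To make the contrast concrete, the rule-based scheme described above can be sketched as a lookup from detected gait phase to a predefined joint parameter. This is only an illustrative sketch; the phase names and damping values below are hypothetical and not taken from any specific device.

```python
# Hypothetical rule-based gait controller of the kind described above:
# discrete gait phases trigger predefined knee damping values.
# All numbers are illustrative, not from a real prosthesis.
PHASE_DAMPING = {
    "stance_flexion": 12.0,    # N*m*s/rad, high damping for weight support
    "stance_extension": 8.0,
    "swing_flexion": 1.5,      # low damping so the shank swings freely
    "swing_extension": 3.0,
}

def knee_damping(phase: str) -> float:
    """Look up the predefined damping for the detected gait phase."""
    return PHASE_DAMPING[phase]

print(knee_damping("swing_flexion"))  # 1.5
```

The point of the sketch is that the output changes only at discrete phase transitions, unlike the continuous modulation a biological joint performs.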
Our investigations focused on the development and evaluation of an emerging methodology to control wearable robotics: the use of neuromechanical modeling as a mid-level controller.
A neuromechanical model is a digital model of the body that can simulate the dynamics of muscle force production, including neuromuscular commands. Compared to pre-existing prosthetic controllers, it enables mimicking the biomechanical function of an intact limb, decoding plausible torque commands to the prosthesis according to subject-specific requirements across various movement conditions.
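As a minimal illustration of the muscle-force dynamics such a model simulates, the classic Hill-type contractile element scales a maximum isometric force by activation, a force-length curve, and a force-velocity curve. This is a simplified sketch, not the thesis's actual model, and all parameter values are hypothetical.

```python
import math

def hill_muscle_force(activation, fiber_length, fiber_velocity,
                      f_max=1000.0, l_opt=0.09, v_max=0.9):
    """Minimal Hill-type contractile-element force (illustrative only).

    activation     : neural activation in [0, 1]
    fiber_length   : current fiber length [m]
    fiber_velocity : lengthening velocity [m/s] (negative = shortening)
    f_max          : maximum isometric force [N]  (hypothetical value)
    l_opt          : optimal fiber length [m]     (hypothetical value)
    v_max          : maximum shortening velocity [m/s]
    """
    # Gaussian force-length relationship, peaking at the optimal length
    fl = math.exp(-((fiber_length - l_opt) / (0.45 * l_opt)) ** 2)
    # Linearized force-velocity relationship (force drops when shortening)
    fv = max(0.0, 1.0 + fiber_velocity / v_max)
    return activation * f_max * fl * fv

# Fully activated, at optimal length, isometric (zero velocity):
print(hill_muscle_force(1.0, 0.09, 0.0))  # 1000.0
```

In a full neuromechanical controller, forces like this would be computed for each modeled muscle and mapped through moment arms to joint torques.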
We investigated two approaches to generate feedforward control inputs to the model: electromyography (EMG) signals, which improve the user's voluntary control and allow for non-steady movements that other models do not account for; and a muscle-synergy model, which can estimate the activity of leg muscles at different walking speeds and elevations, facilitating walking in various conditions with minimal sensor dependency and cognitive effort.
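Muscle synergies are commonly extracted from EMG envelopes with non-negative matrix factorization (NMF), decomposing the muscle-by-time activity matrix into a few muscle weight vectors and their time-varying activations. The sketch below illustrates that standard technique on synthetic stand-in data; it is not the thesis's implementation, and the matrix sizes and iteration count are arbitrary.

```python
import numpy as np

# Synthetic stand-in for rectified, low-pass-filtered EMG envelopes:
# 8 muscles x 200 time samples, built from 4 known synergies.
rng = np.random.default_rng(0)
n_muscles, n_samples, n_synergies = 8, 200, 4
W_true = rng.random((n_muscles, n_synergies))
H_true = rng.random((n_synergies, n_samples))
emg = W_true @ H_true  # non-negative "EMG" matrix

# Multiplicative-update NMF (Lee & Seung) factoring EMG into
# muscle weights W and time-varying synergy activations H.
W = rng.random((n_muscles, n_synergies)) + 1e-6
H = rng.random((n_synergies, n_samples)) + 1e-6
for _ in range(500):
    H *= (W.T @ emg) / (W.T @ W @ H + 1e-12)
    W *= (emg @ H.T) / (W @ H @ H.T + 1e-12)

rel_error = np.linalg.norm(emg - W @ H) / np.linalg.norm(emg)
print(f"relative reconstruction error: {rel_error:.3f}")
```

In a synergy-driven controller, the low-dimensional activations `H` (rather than one signal per muscle) would drive the neuromechanical model, which is what reduces sensor dependency.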
| Original language | English |
|---|---|
| Qualification | Doctor of Philosophy |
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 4 Dec 2025 |
| Place of Publication | Enschede |
| Publisher | |
| Print ISBNs | 978-90-365-6801-2 |
| Electronic ISBNs | 978-90-365-6802-9 |
| DOIs | |
| Publication status | Published - 4 Dec 2025 |