TY - JOUR
T1 - Robust neural decoding for dexterous control of robotic hand kinematics
AU - Fan, Jiahao
AU - Vargas, Luis
AU - Kamper, Derek G.
AU - Hu, Xiaogang
N1 - Publisher Copyright:
© 2023
PY - 2023/8
Y1 - 2023/8
AB - Background: Manual dexterity is a fundamental motor skill that allows us to perform complex daily tasks. Neuromuscular injuries, however, can lead to the loss of hand dexterity. Although numerous advanced assistive robotic hands have been developed, we still lack dexterous, continuous control of multiple degrees of freedom in real time. In this study, we developed an efficient and robust neural decoding approach that can continuously decode intended dynamic finger movements for real-time control of a prosthetic hand. Methods: High-density electromyogram (HD-EMG) signals were obtained from the extrinsic finger flexor and extensor muscles while participants performed either single-finger or multi-finger flexion-extension movements. We implemented a deep learning-based neural network approach to learn the mapping from HD-EMG features to finger-specific population motoneuron firing frequency (i.e., neural-drive signals). The neural-drive signals reflected motor commands specific to individual fingers. The predicted neural-drive signals were then used to continuously control the fingers (index, middle, and ring) of a prosthetic hand in real time. Results: Our neural-drive decoder consistently and accurately predicted joint angles, with significantly lower prediction errors across single-finger and multi-finger tasks compared with a deep learning model trained directly on finger force signals and with the conventional EMG-amplitude estimate. The decoder performance was stable over time and robust to variations in the EMG signals. The decoder also demonstrated substantially better finger separation, with minimal predicted joint-angle error in the unintended fingers. Conclusions: This neural decoding technique offers a novel and efficient neural-machine interface that can consistently predict robotic finger kinematics with high accuracy, enabling dexterous control of assistive robotic hands.
UR - http://www.scopus.com/inward/record.url?scp=85161003377&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85161003377&partnerID=8YFLogxK
U2 - 10.1016/j.compbiomed.2023.107139
DO - 10.1016/j.compbiomed.2023.107139
M3 - Article
C2 - 37301095
AN - SCOPUS:85161003377
SN - 0010-4825
VL - 162
JO - Computers in Biology and Medicine
JF - Computers in Biology and Medicine
M1 - 107139
ER -
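
Note: The record above describes a deep-learning decoder that maps HD-EMG features to finger-specific neural-drive signals for real-time control of prosthetic finger kinematics. The sketch below is a minimal, hypothetical PyTorch illustration of that kind of pipeline only; the network architecture, channel count, window length, and the final neural-drive-to-joint-angle scaling are assumptions for illustration and do not reproduce the authors' actual model.

    # Illustrative sketch only: maps a window of HD-EMG features to per-finger
    # neural-drive estimates (firing frequencies), then to commanded joint angles.
    # Layer sizes, 128 channels, 40-sample window, and the angle scaling are assumptions.
    import torch
    import torch.nn as nn

    class NeuralDriveDecoder(nn.Module):
        """Maps an HD-EMG feature window to neural-drive estimates for three fingers."""

        def __init__(self, n_channels: int = 128, n_fingers: int = 3):
            super().__init__()
            # Temporal convolutions over the EMG feature window.
            self.feature_net = nn.Sequential(
                nn.Conv1d(n_channels, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.Conv1d(64, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # collapse the time dimension
            )
            # Regression head: one non-negative neural-drive value per finger.
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32, n_fingers),
                nn.Softplus(),             # firing frequencies are non-negative
            )

        def forward(self, emg_window: torch.Tensor) -> torch.Tensor:
            # emg_window: (batch, n_channels, window_len)
            return self.head(self.feature_net(emg_window))

    if __name__ == "__main__":
        decoder = NeuralDriveDecoder()
        dummy_emg = torch.randn(8, 128, 40)     # batch of HD-EMG feature windows
        neural_drive = decoder(dummy_emg)       # (8, 3): index, middle, ring estimates
        # Assumed monotonic mapping from neural drive to a commanded joint angle (deg).
        joint_angles_deg = 90.0 * neural_drive / (neural_drive + 1.0)
        print(joint_angles_deg.shape)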