TY - GEN
T1 - Concurrent Decoding of Finger Kinematic and Kinetic Variables based on Motor Unit Discharges
AU - Roy, Rinku
AU - Kamper, Derek G.
AU - Hu, Xiaogang
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - A reliable and functional neural interface is necessary to control individual finger movements of assistive robotic hands. Non-invasive surface electromyogram (sEMG) can be used to predict fingertip forces and joint kinematics continuously. However, concurrent prediction of kinematic and dynamic variables in a continuous manner remains a challenge. The purpose of this study was to develop a neural decoding algorithm capable of concurrent prediction of fingertip forces and dynamic finger movements. High-density electromyogram (HD-EMG) signals were collected during three finger flexion tasks performed with either the index or middle finger: isometric, dynamic, and combined tasks. Based on the data obtained from the first two tasks, motor unit (MU) firing activities associated with individual fingers and tasks were derived using a blind source separation method. MUs assigned to the same tasks and fingers were pooled together to form MU pools. Twenty MUs were then refined using EMG data from a combined trial. The refined MUs were applied to a testing dataset of the combined task, divided into five groups based on the similarity of their firing patterns, and the population discharge frequency was determined for each group. Using the summated firing frequencies obtained from the five groups of MUs in a multivariate linear regression model, fingertip forces and joint angles were derived concurrently. The decoding performance was compared to the conventional EMG amplitude-based approach. For both joint angles and fingertip forces, the MU-based approach outperformed the EMG amplitude approach, with a smaller prediction error (Force: 5.36±0.47 vs 6.89±0.39 %MVC; Joint Angle: 5.0±0.27° vs 12.76±0.40°) and a higher correlation (Force: 0.87±0.05 vs 0.73±0.1; Joint Angle: 0.92±0.05 vs 0.45±0.05) between the predicted and recorded motor output. The outcomes provide a functional and accurate neural interface for continuous control of assistive robotic hands.
AB - A reliable and functional neural interface is necessary to control individual finger movements of assistive robotic hands. Non-invasive surface electromyogram (sEMG) can be used to predict fingertip forces and joint kinematics continuously. However, concurrent prediction of kinematic and dynamic variables in a continuous manner remains a challenge. The purpose of this study was to develop a neural decoding algorithm capable of concurrent prediction of fingertip forces and dynamic finger movements. High-density electromyogram (HD-EMG) signals were collected during three finger flexion tasks performed with either the index or middle finger: isometric, dynamic, and combined tasks. Based on the data obtained from the first two tasks, motor unit (MU) firing activities associated with individual fingers and tasks were derived using a blind source separation method. MUs assigned to the same tasks and fingers were pooled together to form MU pools. Twenty MUs were then refined using EMG data from a combined trial. The refined MUs were applied to a testing dataset of the combined task, divided into five groups based on the similarity of their firing patterns, and the population discharge frequency was determined for each group. Using the summated firing frequencies obtained from the five groups of MUs in a multivariate linear regression model, fingertip forces and joint angles were derived concurrently. The decoding performance was compared to the conventional EMG amplitude-based approach. For both joint angles and fingertip forces, the MU-based approach outperformed the EMG amplitude approach, with a smaller prediction error (Force: 5.36±0.47 vs 6.89±0.39 %MVC; Joint Angle: 5.0±0.27° vs 12.76±0.40°) and a higher correlation (Force: 0.87±0.05 vs 0.73±0.1; Joint Angle: 0.92±0.05 vs 0.45±0.05) between the predicted and recorded motor output. The outcomes provide a functional and accurate neural interface for continuous control of assistive robotic hands.
UR - http://www.scopus.com/inward/record.url?scp=85146288077&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85146288077&partnerID=8YFLogxK
U2 - 10.1109/ICHMS56717.2022.9980636
DO - 10.1109/ICHMS56717.2022.9980636
M3 - Conference contribution
AN - SCOPUS:85146288077
T3 - Proceedings of the 2022 IEEE International Conference on Human-Machine Systems, ICHMS 2022
BT - Proceedings of the 2022 IEEE International Conference on Human-Machine Systems, ICHMS 2022
A2 - Kaber, David
A2 - Guerrieri, Antonio
A2 - Fortino, Giancarlo
A2 - Nurnberger, Andreas
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 3rd IEEE International Conference on Human-Machine Systems, ICHMS 2022
Y2 - 17 November 2022 through 19 November 2022
ER -