TY - JOUR
T1 - Concurrent Prediction of Finger Forces Based on Source Separation and Classification of Neuron Discharge Information
AU - Zheng, Yang
AU - Hu, Xiaogang
N1 - Publisher Copyright:
© 2021 World Scientific Publishing Company.
PY - 2021/6
Y1 - 2021/6
N2 - A reliable neural-machine interface is essential for humans to intuitively interact with advanced robotic hands in an unconstrained environment. Existing neural decoding approaches utilize either discrete hand gesture-based pattern recognition or continuous force decoding of one finger at a time. We developed a neural decoding technique that allowed continuous and concurrent prediction of the forces of different fingers based on spinal motoneuron firing information. High-density skin-surface electromyogram (HD-EMG) signals of the finger extensor muscle were recorded while human participants produced isometric flexion forces in a dexterous manner (i.e., produced varying forces using either a single finger or multiple fingers concurrently). Motoneuron firing information was extracted from the EMG signals using a blind source separation technique, and each identified neuron was then classified as being associated with a given finger. The forces of individual fingers were predicted concurrently by utilizing the firing frequency of the motoneuron pool associated with each finger. Compared with conventional approaches, our technique led to better prediction performance, i.e., a higher correlation (0.71 ± 0.11 versus 0.61 ± 0.09), a lower prediction error (5.88 ± 1.34% MVC versus 7.56 ± 1.60% MVC), and a higher accuracy in finger-state (rest/active) prediction (88.10 ± 4.65% versus 80.21 ± 4.32%). Our decoding method demonstrated the possibility of classifying motoneurons by finger, which substantially alleviated the cross-talk issue in EMG recordings from neighboring hand muscles and allowed finger forces to be decoded both individually and concurrently. The outcomes offered a robust neural-machine interface that could allow users to intuitively control robotic hands in a dexterous manner.
UR - http://www.scopus.com/inward/record.url?scp=85100580341&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85100580341&partnerID=8YFLogxK
U2 - 10.1142/S0129065721500106
DO - 10.1142/S0129065721500106
M3 - Article
C2 - 33541251
AN - SCOPUS:85100580341
SN - 0129-0657
VL - 31
JO - International Journal of Neural Systems
JF - International Journal of Neural Systems
IS - 6
M1 - 2150010
ER -