TY - GEN
T1 - Virtual Reality for Evaluating Prosthetic Hand Control Strategies
T2 - 43rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2021
AU - Xie, Jason
AU - Hu, Xiaogang
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Improving prosthetic hand functionality is critical to reducing abandonment rates and improving amputees' quality of life. Techniques such as joint force estimation and gesture recognition using myoelectric signals could enable more realistic control of the prosthetic hand. To accelerate the translation of these advanced control strategies from lab to clinic, we created a virtual prosthetic control environment that enables rich user interactions and dexterity evaluation. The virtual environment consists of two parts: a Unity scene for rendering and user interaction, and a Python back-end that supports accurate physics simulation and communication with control algorithms. By utilizing the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without additional motion tracking setups. In the virtual environment, we demonstrate actuation of the prosthetic hand through decoded EMG signal streaming, hand tracking, and the use of a VR controller. By providing a flexible platform to investigate different control modalities, we believe that our virtual environment will allow for faster experimentation and further progress in clinical translation.
AB - Improving prosthetic hand functionality is critical to reducing abandonment rates and improving amputees' quality of life. Techniques such as joint force estimation and gesture recognition using myoelectric signals could enable more realistic control of the prosthetic hand. To accelerate the translation of these advanced control strategies from lab to clinic, we created a virtual prosthetic control environment that enables rich user interactions and dexterity evaluation. The virtual environment consists of two parts: a Unity scene for rendering and user interaction, and a Python back-end that supports accurate physics simulation and communication with control algorithms. By utilizing the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without additional motion tracking setups. In the virtual environment, we demonstrate actuation of the prosthetic hand through decoded EMG signal streaming, hand tracking, and the use of a VR controller. By providing a flexible platform to investigate different control modalities, we believe that our virtual environment will allow for faster experimentation and further progress in clinical translation.
UR - http://www.scopus.com/inward/record.url?scp=85122504030&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85122504030&partnerID=8YFLogxK
U2 - 10.1109/EMBC46164.2021.9630555
DO - 10.1109/EMBC46164.2021.9630555
M3 - Conference contribution
C2 - 34892545
AN - SCOPUS:85122504030
T3 - Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
SP - 6263
EP - 6266
BT - 43rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 1 November 2021 through 5 November 2021
ER -