TY - GEN
T1 - Application Informed Motion Signal Processing for Finger Motion Tracking Using Wearable Sensors
AU - Liu, Yilin
AU - Jiang, Fengyang
AU - Gowda, Mahanth
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/5
Y1 - 2020/5
N2 - Finger motion tracking has a number of applications in user interfaces, sports analytics, medical rehabilitation and sign language translation. This paper presents a system called FinGTrAC that shows the feasibility of fine-grained finger gesture tracking using a low-intrusion wearable sensor platform (a smart-ring worn on the index finger and a smart-watch worn on the wrist). Such sparse sensors are convenient to wear but cannot track all fingers and hence provide under-constrained information. However, application-specific context can fill the gap left by sparse sensing and improve the accuracy of gesture classification. This paper shows the feasibility of exploiting such context in an application of American Sign Language (ASL) translation. Non-trivial challenges arise due to noisy sensor data, variations in gesture performance across users and the inability to capture data from all fingers. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the available sensory information into a Bayesian filtering framework. Culminating in the design of a Hidden Markov Model, a Viterbi decoding scheme is designed to detect finger gestures and the corresponding ASL sentences in real time. Extensive evaluation on 10 users shows a detection accuracy of 94.2% for the 100 most frequently used ASL finger gestures over different sentences.
AB - Finger motion tracking has a number of applications in user interfaces, sports analytics, medical rehabilitation and sign language translation. This paper presents a system called FinGTrAC that shows the feasibility of fine-grained finger gesture tracking using a low-intrusion wearable sensor platform (a smart-ring worn on the index finger and a smart-watch worn on the wrist). Such sparse sensors are convenient to wear but cannot track all fingers and hence provide under-constrained information. However, application-specific context can fill the gap left by sparse sensing and improve the accuracy of gesture classification. This paper shows the feasibility of exploiting such context in an application of American Sign Language (ASL) translation. Non-trivial challenges arise due to noisy sensor data, variations in gesture performance across users and the inability to capture data from all fingers. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the available sensory information into a Bayesian filtering framework. Culminating in the design of a Hidden Markov Model, a Viterbi decoding scheme is designed to detect finger gestures and the corresponding ASL sentences in real time. Extensive evaluation on 10 users shows a detection accuracy of 94.2% for the 100 most frequently used ASL finger gestures over different sentences.
UR - http://www.scopus.com/inward/record.url?scp=85089210410&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85089210410&partnerID=8YFLogxK
U2 - 10.1109/ICASSP40776.2020.9053466
DO - 10.1109/ICASSP40776.2020.9053466
M3 - Conference contribution
AN - SCOPUS:85089210410
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 8334
EP - 8338
BT - 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020
Y2 - 4 May 2020 through 8 May 2020
ER -