Finger Gesture Tracking for Interactive Applications: A Pilot Study with Sign Languages

Yilin Liu, Fengyang Jiang, Mahanth Gowda

Research output: Contribution to journal › Article › peer-review

17 Scopus citations


This paper presents FinGTrAC, a system that shows the feasibility of fine-grained finger gesture tracking using a low-intrusion wearable sensor platform (a smart ring worn on the index finger and a smartwatch worn on the wrist). The key contribution is scaling gesture recognition up to hundreds of gestures while using only a sparse wearable sensor set, whereas prior work has detected only tens of hand gestures. Such sparse sensors are convenient to wear but cannot track all fingers, and hence provide under-constrained information. However, application-specific context can fill the gap in sparse sensing and improve the accuracy of gesture classification. Rich context exists in a number of applications such as user interfaces, sports analytics, medical rehabilitation, and sign language translation. This paper shows the feasibility of exploiting such context in an application of American Sign Language (ASL) translation. Noisy sensor data, variations in gesture performance across users, and the inability to capture data from all fingers introduce non-trivial challenges. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the available sensory information into a Bayesian filtering framework. Culminating in a Hidden Markov Model, a Viterbi decoding scheme is designed to detect finger gestures and the corresponding ASL sentences in real time. Extensive evaluation on 10 users shows a recognition accuracy of 94.2% for the 100 most frequently used ASL finger gestures over different sentences. When the size of the dictionary is extended to 200 words, the accuracy degrades gracefully to 90%, indicating the robustness and scalability of the multi-stage optimization framework.
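The paper's implementation is not reproduced here; as a generic illustration of the Viterbi decoding step mentioned in the abstract, the sketch below runs standard log-space Viterbi over a tiny hypothetical HMM. The state set, transition/emission probabilities, and observation encoding are placeholders, not the paper's actual model.

```python
import math

def viterbi(obs, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence.

    obs     : list of observation symbol indices
    start_p : [n_states] initial probabilities
    trans_p : [n_states][n_states] transition probabilities
    emit_p  : [n_states][n_symbols] emission probabilities
    Works in log space to avoid numerical underflow on long sequences.
    """
    n_states = len(start_p)
    log = lambda p: math.log(p) if p > 0 else float("-inf")
    # V[t][s]: best log-probability of any path ending in state s at step t
    V = [[log(start_p[s]) + log(emit_p[s][obs[0]]) for s in range(n_states)]]
    back = [[0] * n_states]  # backpointers for path recovery
    for t in range(1, len(obs)):
        V.append([0.0] * n_states)
        back.append([0] * n_states)
        for s in range(n_states):
            best_prev = max(range(n_states),
                            key=lambda p: V[t - 1][p] + log(trans_p[p][s]))
            back[t][s] = best_prev
            V[t][s] = (V[t - 1][best_prev] + log(trans_p[best_prev][s])
                       + log(emit_p[s][obs[t]]))
    # Backtrack from the best final state
    path = [max(range(n_states), key=lambda s: V[-1][s])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Toy 2-state HMM: each state strongly prefers its own observation symbol
path = viterbi(obs=[0, 0, 1, 1],
               start_p=[0.9, 0.1],
               trans_p=[[0.8, 0.2], [0.2, 0.8]],
               emit_p=[[0.9, 0.1], [0.1, 0.9]])
print(path)  # → [0, 0, 1, 1]
```

In FinGTrAC, the hidden states would correspond to candidate finger gestures and the transition probabilities would encode sentence-level ASL context, allowing the decoder to resolve ambiguity that the sparse ring-and-watch sensors cannot.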

Original language: English (US)
Article number: 112
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Issue number: 3
State: Published - Sep 4, 2020

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Networks and Communications


