Finger motion tracking has a number of applications in user interfaces, sports analytics, medical rehabilitation, and sign language translation. This paper presents FinGTrAC, a system that shows the feasibility of fine-grained finger gesture tracking using a minimally intrusive wearable sensor platform (a smart ring worn on the index finger and a smartwatch worn on the wrist). Such sparse sensors are convenient to wear but cannot track all fingers, and hence provide under-constrained information. However, application-specific context can fill the gap left by sparse sensing and improve the accuracy of gesture classification. This paper shows the feasibility of exploiting such context in an application to American Sign Language (ASL) translation. Non-trivial challenges arise from noisy sensor data, variations in gesture performance across users, and the inability to capture data from all fingers. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the available sensory information into a Bayesian filtering framework. These techniques culminate in a Hidden Markov Model, over which a Viterbi decoding scheme detects finger gestures and the corresponding ASL sentences in real time. Extensive evaluation on 10 users shows a detection accuracy of 94.2% for the 100 most frequently used ASL finger gestures across different sentences.
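
The abstract's core inference step, Viterbi decoding over an HMM, can be illustrated with a minimal sketch. This is not the paper's implementation: the two gesture states, three observation clusters, and all probabilities below are hypothetical placeholders standing in for FinGTrAC's gesture states and quantized sensor readings.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Return the most likely hidden-state sequence for an HMM.

    obs: sequence of observation indices
    pi:  initial state probabilities, shape (S,)
    A:   transition matrix, A[i, j] = P(state j | state i)
    B:   emission matrix, B[i, k] = P(observation k | state i)
    """
    T = len(obs)
    # log-probability of the best path ending in each state at time 0
    delta = np.log(pi) + np.log(B[:, obs[0]])
    back = np.zeros((T, len(pi)), dtype=int)  # backpointers
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)   # score of each predecessor
        back[t] = scores.argmax(axis=0)       # best predecessor per state
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    # trace backpointers from the best final state
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Hypothetical toy HMM: two gesture states, three sensor-reading clusters.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2], pi, A, B))  # → [0, 0, 1]
```

In FinGTrAC's setting, the hidden states would correspond to candidate finger gestures, the observations to processed ring and watch sensor data, and the transition probabilities would encode ASL sentence context.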