Designing a Real-Time ASL Auto-Recognition Tool Using Deep Learning

Pakhi Agarwal, Jian Liao, Simon Hooper, Rayne Sperling

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

American Sign Language comprises more than 500 signs and gestures. Assessing sign language for each individual learner is often time-consuming, especially when instructors assess learners' work regularly. In this project, we describe the design of an ASL auto-recognition tool that can evaluate students' signed videos automatically and thereby support self-evaluation. The tool uses deep learning methods to recognize ASL gestures and provides immediate feedback to ASL learners. The results suggest that the accuracy of the model is acceptable under adequate lighting conditions. Hence, the tool can potentially be used to support instructors with sign language assessment.
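The abstract does not specify the model or recognition pipeline. Purely as an illustration, the sketch below shows one way a frame-level deep learning classifier could be connected to a webcam to return immediate feedback on a sign; the model file name, 224x224 input size, and label set are hypothetical assumptions, not details taken from the paper.

    # Illustrative sketch only; not the authors' implementation.
    # Assumes a pre-trained frame-level CNN saved as "asl_classifier.h5"
    # and a small, hypothetical subset of sign labels.
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    MODEL_PATH = "asl_classifier.h5"           # hypothetical trained model
    LABELS = ["hello", "thank_you", "please"]  # hypothetical sign labels

    model = load_model(MODEL_PATH)
    cap = cv2.VideoCapture(0)  # default webcam

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Preprocess the frame to match the assumed network input.
        img = cv2.resize(frame, (224, 224))
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB).astype("float32") / 255.0
        probs = model.predict(np.expand_dims(img, axis=0), verbose=0)[0]
        label = LABELS[int(np.argmax(probs))]
        # Overlay the prediction as immediate feedback to the learner.
        cv2.putText(frame, f"{label}: {probs.max():.2f}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("ASL recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()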

Original language: English (US)
Title of host publication: Proceedings - 2021 10th International Conference of Educational Innovation through Technology, EITT 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 163-166
Number of pages: 4
ISBN (Electronic): 9781665427579
DOIs
State: Published - 2021
Event: 10th International Conference of Educational Innovation through Technology, EITT 2021 - Virtual, Chongqing, China
Duration: Dec 16 2021 - Dec 20 2021

Publication series

Name: Proceedings - 2021 10th International Conference of Educational Innovation through Technology, EITT 2021

Conference

Conference: 10th International Conference of Educational Innovation through Technology, EITT 2021
Country/Territory: China
City: Virtual, Chongqing
Period: 12/16/21 - 12/20/21

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems and Management
  • Education
