Project Details
Description
The broader impact/commercial potential of this I-Corps project is the development of a smart ring with integrated sensors for applications in extended reality and smart healthcare. The proposed technology may be used to remotely monitor patients' biometric health data, including heart rate, oxygen saturation, blood pressure, blood glucose, blood alcohol concentration and indoor location. This may improve early detection of health issues, help track and improve health and wellness, and reduce the human resources devoted to patient monitoring. In addition, the proposed smart ring technology may help deaf and hard-of-hearing individuals communicate with hearing individuals: the technology is designed with an integrated sign language recognition (SLR) system. Currently, there are about 10 million deaf and hard-of-hearing individuals in the US and 466 million globally; however, an efficient communication tool does not exist. The proposed technology could serve an important need for the deaf and hard-of-hearing community.

This I-Corps project is based on the development of a 3D finger motion tracking smart ring. The proposed technology integrates low-cost, off-the-shelf electronic components in a system-on-a-chip (SoC) architecture on a flexible printed circuit board that can bend into the form factor of a ring. Inertial measurement unit (IMU) and photoplethysmography (PPG) sensors are embedded to enable motion analytics such as sign language recognition, as well as healthcare applications such as monitoring heart rate, oxygen saturation, blood pressure, blood alcohol concentration and blood glucose. The hardware is enclosed in a 3D-printed waterproof case that allows smooth contact with the skin. Overall, the device weighs about 2.5 g and offers roughly a week of battery life, allowing comfortable, long-term wear, including while sleeping and swimming. Results show the feasibility of tracking 24 degrees of freedom of finger motion using sparse rings placed on only a few fingers. This is an underconstrained problem without well-formed equations because sensors are not placed on every finger or joint. Deep learning techniques that exploit correlations between the motion of different fingers are used to learn the mapping between sensor data and finger motion and thereby solve the underconstrained problem.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
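The last paragraph above describes learning a mapping from sparse ring sensor data to 24 finger-motion degrees of freedom with a deep model that exploits inter-finger correlations. Below is a minimal sketch of what such a sequence-to-pose regressor could look like, assuming PyTorch and hypothetical tensor shapes (two rings with 6-axis IMUs in, 24 joint angles out); the project's actual architecture is not specified in the award.

```python
# Minimal sketch (assumed, not the project's actual model): a recurrent
# regressor mapping sparse-ring IMU sequences to 24 finger joint angles.
import torch
import torch.nn as nn

class RingPoseNet(nn.Module):
    def __init__(self, n_rings=2, imu_channels=6, hidden=128, n_dof=24):
        super().__init__()
        # Input per time step: accelerometer + gyroscope (6 channels) per worn ring.
        self.gru = nn.GRU(n_rings * imu_channels, hidden,
                          num_layers=2, batch_first=True)
        # A single shared head over all 24 DOF lets the network exploit
        # correlations between fingers when resolving the underconstrained pose.
        self.head = nn.Linear(hidden, n_dof)

    def forward(self, x):           # x: (batch, time, n_rings * imu_channels)
        h, _ = self.gru(x)
        return self.head(h)         # (batch, time, n_dof) predicted joint angles

model = RingPoseNet()
dummy = torch.randn(8, 200, 12)     # 8 sequences, 200 samples, 2 rings x 6 axes
pred = model(dummy)                  # per-sample 24-DOF finger pose estimates
loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))  # placeholder target
```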
| Status | Finished |
| --- | --- |
| Effective start/end date | 9/1/23 → 8/31/24 |
Funding
- National Science Foundation: $50,000.00