Abstract
GOAL AND AIMS: Commonly used actigraphy algorithms are designed to operate within a known in-bed interval. However, in free-living scenarios this interval is often unknown. We trained and evaluated a sleep/wake classifier that operates on actigraphy over ∼24-hour intervals, without knowledge of in-bed timing.
FOCUS TECHNOLOGY: Actigraphy counts from ActiWatch Spectrum devices.
REFERENCE TECHNOLOGY: Sleep staging derived from polysomnography, supplemented by observation of wakefulness outside of the staged interval. Classifications from the Oakley actigraphy algorithm were additionally used as a performance reference.
SAMPLE: Adults, sleeping in either a home or laboratory environment.
DESIGN: Machine learning was used to train and evaluate a sleep/wake classifier in a supervised learning paradigm. The classifier is a temporal convolutional network, a form of deep neural network.
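As a rough illustration of the architecture named above, the sketch below shows a dilated temporal convolutional network that maps a ∼24-hour sequence of activity counts to per-epoch sleep/wake logits. All hyperparameters (channel width, kernel size, number of blocks, 30-second epoch length) are illustrative assumptions and are not the values used in the study.

```python
# Minimal sketch of a temporal convolutional network (TCN) for epoch-by-epoch
# sleep/wake classification from actigraphy counts. Architectural details are
# illustrative assumptions, not the parameters reported in the paper.
import torch
import torch.nn as nn


class DilatedBlock(nn.Module):
    """One residual block of dilated 1-D convolutions."""

    def __init__(self, channels: int, kernel_size: int, dilation: int):
        super().__init__()
        padding = (kernel_size - 1) // 2 * dilation  # preserve sequence length
        self.conv1 = nn.Conv1d(channels, channels, kernel_size,
                               padding=padding, dilation=dilation)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size,
                               padding=padding, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):
        out = self.act(self.conv1(x))
        out = self.act(self.conv2(out))
        return x + out  # residual connection


class SleepWakeTCN(nn.Module):
    """Maps a sequence of activity counts to per-epoch sleep/wake logits."""

    def __init__(self, in_features: int = 1, channels: int = 32,
                 kernel_size: int = 5, n_blocks: int = 6):
        super().__init__()
        self.input_proj = nn.Conv1d(in_features, channels, kernel_size=1)
        # Exponentially increasing dilation widens the receptive field so each
        # epoch's prediction can draw on hours of surrounding context.
        self.blocks = nn.Sequential(*[
            DilatedBlock(channels, kernel_size, dilation=2 ** i)
            for i in range(n_blocks)
        ])
        self.head = nn.Conv1d(channels, 1, kernel_size=1)  # sleep/wake logit

    def forward(self, counts):
        # counts: (batch, sequence_length, in_features)
        x = counts.transpose(1, 2)      # -> (batch, features, time)
        x = self.input_proj(x)
        x = self.blocks(x)
        return self.head(x).squeeze(1)  # (batch, time) logits


# ~24 hours of 30-second epochs: 2880 epochs per record (assumed epoch length).
model = SleepWakeTCN()
dummy_counts = torch.randn(2, 2880, 1)   # placeholder activity counts
logits = model(dummy_counts)             # (2, 2880) per-epoch logits
sleep_probability = torch.sigmoid(logits)
```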
CORE ANALYTICS: Performance was evaluated across ∼24 hours and, additionally, restricted to in-bed intervals only, both in terms of epoch-by-epoch agreement and the discrepancy of summary statistics within those intervals.
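The two evaluation views described above can be pictured with the following sketch: epoch-by-epoch agreement computed over the full ∼24-hour record and again restricted to the in-bed interval, plus the discrepancy of one summary statistic within that interval (total sleep time is used here purely as an example). The metric set and helper names are assumptions for illustration, not the study's exact analysis code.

```python
# Illustrative sketch of full-record vs. in-bed-only evaluation of a
# sleep/wake classifier against a reference hypnogram (1 = sleep, 0 = wake).
import numpy as np


def epoch_metrics(pred: np.ndarray, ref: np.ndarray) -> dict:
    """Epoch-by-epoch accuracy, sensitivity (sleep), and specificity (wake)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    tp = np.sum(pred & ref)      # sleep correctly scored as sleep
    tn = np.sum(~pred & ~ref)    # wake correctly scored as wake
    return {
        "accuracy": (tp + tn) / ref.size,
        "sensitivity": tp / max(ref.sum(), 1),
        "specificity": tn / max((~ref).sum(), 1),
    }


def evaluate(pred, ref, in_bed_mask, epoch_sec=30):
    """Compare classifier output against reference over ~24 h and in bed only."""
    pred, ref, in_bed = (np.asarray(a) for a in (pred, ref, in_bed_mask))
    in_bed = in_bed.astype(bool)
    results = {
        "full_24h": epoch_metrics(pred, ref),
        "in_bed_only": epoch_metrics(pred[in_bed], ref[in_bed]),
    }
    # Discrepancy of a summary statistic within the in-bed interval,
    # e.g. total sleep time in minutes.
    tst_pred = pred[in_bed].sum() * epoch_sec / 60
    tst_ref = ref[in_bed].sum() * epoch_sec / 60
    results["tst_discrepancy_min"] = float(tst_pred - tst_ref)
    return results


# Toy example: 2880 epochs (~24 h at 30-s epochs), in bed for the middle third.
rng = np.random.default_rng(0)
ref = rng.integers(0, 2, 2880)
pred = ref.copy()
pred[rng.random(2880) < 0.1] ^= 1        # flip 10% of epochs to simulate error
in_bed = np.zeros(2880, dtype=bool)
in_bed[960:1920] = True
print(evaluate(pred, ref, in_bed))
```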
ADDITIONAL ANALYTICS AND EXPLORATORY ANALYSES: Performance of the trained model applied to the Multi-Ethnic Study of Atherosclerosis dataset.
CORE OUTCOMES: Over ∼24 hours, the temporal convolutional network classifier performed as well as or better than the Oakley classifier on all measures tested. When analysis was restricted to the in-bed interval, the temporal convolutional network remained favorable on several metrics.
IMPORTANT SUPPLEMENTAL OUTCOMES: Performance decreased on the Multi-Ethnic Study of Atherosclerosis dataset, especially when restricting analysis to the in-bed interval.
CORE CONCLUSION: A classifier trained on data labeled over ∼24-hour intervals allows continuous sleep/wake classification without knowledge of in-bed intervals. Further development should focus on improving generalization performance.
| Original language | English (US) |
|---|---|
| Journal | Sleep Health |
| State | E-pub ahead of print - Aug 10 2023 |