TY - GEN
T1 - TrustSleepNet
T2 - 2022 IEEE-EMBS International Conference on Biomedical and Health Informatics, BHI 2022
AU - Huang, Guanjie
AU - Ma, Fenglong
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Correctly classifying different sleep stages is a critical and prerequisite step in diagnosing sleep-related issues. In practice, clinical experts must manually review polysomnography (PSG) recordings to classify sleep stages. Such a procedure is time-consuming, laborious, and potentially prone to subjective human error. Deep learning-based methods have been successfully adopted for automatically classifying sleep stages in recent years. However, despite their good performance, they cannot simply say 'I do not know' when they are uncertain about their predictions, which can easily create significant risk in clinical applications. To address this issue, we propose a deep model, named TrustSleepNet, which contains evidential learning and cross-modality attention modules. Evidential learning predicts the probability density of the classes, which can learn an uncertainty score and make the prediction trustable in real-world clinical applications. Cross-modality attention adaptively fuses multimodal PSG data by enhancing significant modalities and suppressing irrelevant ones. Experimental results demonstrate that TrustSleepNet outperforms state-of-the-art benchmark methods, and the uncertainty score makes the prediction more trustable and reliable.
AB - Correctly classifying different sleep stages is a critical and prerequisite step in diagnosing sleep-related issues. In practice, clinical experts must manually review polysomnography (PSG) recordings to classify sleep stages. Such a procedure is time-consuming, laborious, and potentially prone to subjective human error. Deep learning-based methods have been successfully adopted for automatically classifying sleep stages in recent years. However, despite their good performance, they cannot simply say 'I do not know' when they are uncertain about their predictions, which can easily create significant risk in clinical applications. To address this issue, we propose a deep model, named TrustSleepNet, which contains evidential learning and cross-modality attention modules. Evidential learning predicts the probability density of the classes, which can learn an uncertainty score and make the prediction trustable in real-world clinical applications. Cross-modality attention adaptively fuses multimodal PSG data by enhancing significant modalities and suppressing irrelevant ones. Experimental results demonstrate that TrustSleepNet outperforms state-of-the-art benchmark methods, and the uncertainty score makes the prediction more trustable and reliable.
UR - http://www.scopus.com/inward/record.url?scp=85143058067&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85143058067&partnerID=8YFLogxK
U2 - 10.1109/BHI56158.2022.9926875
DO - 10.1109/BHI56158.2022.9926875
M3 - Conference contribution
AN - SCOPUS:85143058067
T3 - BHI-BSN 2022 - IEEE-EMBS International Conference on Biomedical and Health Informatics and IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks, Symposium Proceedings
BT - BHI-BSN 2022 - IEEE-EMBS International Conference on Biomedical and Health Informatics and IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks, Symposium Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 27 September 2022 through 30 September 2022
ER -