TY - GEN
T1 - Exploring and promoting diagnostic transparency and explainability in online symptom checkers
AU - Tsai, Chun Hua
AU - You, Yue
AU - Gui, Xinning
AU - Kou, Yubo
AU - Carroll, John M.
N1 - Publisher Copyright:
© 2021 ACM.
PY - 2021/5/6
Y1 - 2021/5/6
N2 - Online symptom checkers (OSCs) are widely used intelligent systems in health contexts such as primary care, remote healthcare, and epidemic control. OSCs use algorithms such as machine learning to facilitate self-diagnosis and triage based on symptoms input by healthcare consumers. However, intelligent systems' lack of transparency and comprehensibility could lead to unintended consequences such as misleading users, especially in high-stakes areas such as healthcare. In this paper, we attempt to enhance diagnostic transparency by augmenting OSCs with explanations. We first conducted an interview study (N=25) to elicit explanation needs from users of existing OSCs. Then, we designed a COVID-19 OSC that was enhanced with three types of explanations. Our lab-controlled user study (N=20) found that explanations can significantly improve user experience in multiple aspects. We discuss how explanations are interwoven into conversation flow and present implications for future OSC designs.
AB - Online symptom checkers (OSCs) are widely used intelligent systems in health contexts such as primary care, remote healthcare, and epidemic control. OSCs use algorithms such as machine learning to facilitate self-diagnosis and triage based on symptoms input by healthcare consumers. However, intelligent systems' lack of transparency and comprehensibility could lead to unintended consequences such as misleading users, especially in high-stakes areas such as healthcare. In this paper, we attempt to enhance diagnostic transparency by augmenting OSCs with explanations. We first conducted an interview study (N=25) to elicit explanation needs from users of existing OSCs. Then, we designed a COVID-19 OSC that was enhanced with three types of explanations. Our lab-controlled user study (N=20) found that explanations can significantly improve user experience in multiple aspects. We discuss how explanations are interwoven into conversation flow and present implications for future OSC designs.
UR - http://www.scopus.com/inward/record.url?scp=85106748680&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85106748680&partnerID=8YFLogxK
U2 - 10.1145/3411764.3445101
DO - 10.1145/3411764.3445101
M3 - Conference contribution
AN - SCOPUS:85106748680
T3 - Conference on Human Factors in Computing Systems - Proceedings
BT - CHI 2021 - Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
T2 - 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths, CHI 2021
Y2 - 8 May 2021 through 13 May 2021
ER -