TY - JOUR
T1 - The shoutcasters, the game enthusiasts, and the AI
T2 - Foraging for explanations of real-time strategy players
AU - Penney, Sean
AU - Dodge, Jonathan
AU - Anderson, Andrew
AU - Hilderbrand, Claudia
AU - Simpson, Logan
AU - Burnett, Margaret
N1 - Publisher Copyright:
© 2021 Association for Computing Machinery.
PY - 2021/4
Y1 - 2021/4
N2 - Assessing and understanding intelligent agents is a difficult task for users who lack an AI background. "Explainable AI" (XAI) aims to address this problem, but what should be in an explanation? One route toward answering this question is to turn to theories of how humans try to obtain the information they seek. Information Foraging Theory (IFT) is one such theory. In this article, we present a series of studies using IFT: the first investigates how expert explainers supply explanations in the real-time strategy (RTS) domain, the second investigates what explanations domain experts demand from agents in the RTS domain, and the last focuses on how both populations try to explain a state-of-the-art AI. Our results show that RTS environments like StarCraft offer so many options, changing so rapidly, that foraging tends to be very costly. Ways foragers attempted to manage such costs included "satisficing" approaches to reduce their cognitive load, such as focusing more on What information than on Why information; strategic use of language to communicate a lot of nuanced information in a few words; and optimizing their environment when possible to make their most valuable information patches readily available. Further, when a real AI entered the picture, even very experienced domain experts had difficulty understanding and judging some of the AI's unconventional behaviors. Finally, our results reveal ways Information Foraging Theory can inform future XAI interactive explanation environments, and also how XAI can inform IFT.
AB - Assessing and understanding intelligent agents is a difficult task for users who lack an AI background. "Explainable AI" (XAI) aims to address this problem, but what should be in an explanation? One route toward answering this question is to turn to theories of how humans try to obtain the information they seek. Information Foraging Theory (IFT) is one such theory. In this article, we present a series of studies using IFT: the first investigates how expert explainers supply explanations in the real-time strategy (RTS) domain, the second investigates what explanations domain experts demand from agents in the RTS domain, and the last focuses on how both populations try to explain a state-of-the-art AI. Our results show that RTS environments like StarCraft offer so many options, changing so rapidly, that foraging tends to be very costly. Ways foragers attempted to manage such costs included "satisficing" approaches to reduce their cognitive load, such as focusing more on What information than on Why information; strategic use of language to communicate a lot of nuanced information in a few words; and optimizing their environment when possible to make their most valuable information patches readily available. Further, when a real AI entered the picture, even very experienced domain experts had difficulty understanding and judging some of the AI's unconventional behaviors. Finally, our results reveal ways Information Foraging Theory can inform future XAI interactive explanation environments, and also how XAI can inform IFT.
UR - http://www.scopus.com/inward/record.url?scp=85104940628&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85104940628&partnerID=8YFLogxK
U2 - 10.1145/3396047
DO - 10.1145/3396047
M3 - Article
AN - SCOPUS:85104940628
SN - 2160-6455
VL - 11
JO - ACM Transactions on Interactive Intelligent Systems
JF - ACM Transactions on Interactive Intelligent Systems
IS - 1
M1 - 2
ER -