Assessment of robot guidance modalities conveying instructions to humans in emergency situations

Paul Robinette, Alan R. Wagner, Ayanna M. Howard

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Scopus citations

Abstract

Motivated by the desire to mitigate human casualties in emergency situations, this paper explores various guidance modalities provided by a robotic platform for instructing humans to safely evacuate during an emergency. We focus on physical modifications of the robot that enable visual guidance instructions, since auditory guidance instructions pose potential problems in a noisy emergency environment. Robotic platforms can convey visual guidance instructions through motion, static signs, dynamic signs, and gestures using single or multiple arms. In this paper, we discuss the different guidance modalities instantiated by different physical platform constructs and assess the abilities of the platforms to convey information related to evacuation. Human-robot interaction studies with 192 participants show that participants were able to understand the information conveyed by the various robotic constructs in 75.8% of cases when using dynamic signs with multi-arm gestures, as opposed to 18.0% when using static signs for visual guidance. Notably, dynamic signs performed equivalently to single-arm gestures overall but drastically differently at the two distance levels tested. Based on these studies, we conclude that dynamic signs are important for information conveyance when the robot is in close proximity to the human, but multi-arm gestures are necessary when information must be conveyed across a greater distance.
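As a minimal illustration of how the reported comprehension rates (75.8% for dynamic signs with multi-arm gestures vs. 18.0% for static signs) might be compared, the sketch below applies a standard two-proportion z-test. The per-condition sample sizes are assumed placeholders for illustration only; the paper does not report them here, and this is not the authors' analysis.

```python
# Hypothetical sketch: comparing the two comprehension rates from the
# abstract with a two-proportion z-test. Per-condition counts below are
# ASSUMED placeholders, not values reported in the paper.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return the z statistic for the difference between two proportions."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)               # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # pooled standard error
    return (p_a - p_b) / se

# Assumed equal split of trials across the two conditions (placeholder).
n_per_condition = 96
z = two_proportion_z(round(0.758 * n_per_condition), n_per_condition,
                     round(0.180 * n_per_condition), n_per_condition)
print(f"z = {z:.2f}")  # a large |z| indicates the gap is unlikely to be chance
```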

Original language: English (US)
Title of host publication: IEEE RO-MAN 2014 - 23rd IEEE International Symposium on Robot and Human Interactive Communication
Subtitle of host publication: Human-Robot Co-Existence: Adaptive Interfaces and Systems for Daily Life, Therapy, Assistance and Socially Engaging Interactions
Editors: Rui Loureiro, Aris Alissandrakis, Adriana Tapus, Selma Sabanovic, Fumihide Tanaka, Yukie Nagai
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1043-1049
Number of pages: 7
ISBN (Electronic): 9781479967636
DOIs
State: Published - Oct 15 2014
Event: 23rd IEEE International Symposium on Robot and Human Interactive Communication, IEEE RO-MAN 2014 - Edinburgh, United Kingdom
Duration: Aug 25 2014 - Aug 29 2014

Publication series

Name: IEEE RO-MAN 2014 - 23rd IEEE International Symposium on Robot and Human Interactive Communication: Human-Robot Co-Existence: Adaptive Interfaces and Systems for Daily Life, Therapy, Assistance and Socially Engaging Interactions

Other

Other: 23rd IEEE International Symposium on Robot and Human Interactive Communication, IEEE RO-MAN 2014
Country/Territory: United Kingdom
City: Edinburgh
Period: 8/25/14 - 8/29/14

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Science Applications
  • Human-Computer Interaction
  • Software
  • Computer Vision and Pattern Recognition
