User capabilities in eyes-free spatial target acquisition in immersive virtual reality environments

Huiyue Wu, Yanyi Deng, Jiajun Pan, Tianxing Han, Yonglin Hu, Kaini Huang, Xiaolong (Luke) Zhang

Research output: Contribution to journal › Article › peer-review


Abstract

In immersive virtual reality (VR) environments, users rely on the visual channel to search for objects. Such eyes-engaged interaction techniques can significantly degrade interaction efficiency and user experience, particularly when users must turn their heads frequently to search for a target object within the limited field of view (FOV) of a head-mounted display (HMD). In this study, we systematically investigated user capabilities in eyes-free spatial target acquisition across different horizontal angles, vertical angles, distances from the user's body, and body sides. Our results show that high acquisition accuracy and low task load are achieved for target locations at front and middle horizontal angles, as well as at middle vertical angles. By contrast, acquisition accuracy and task load cannot both be kept favorable for target locations at long distances from the user's body. In addition, acquisition accuracy and task load vary with the body side on which the target is located. Our findings provide a deeper understanding of user capability in eyes-free target acquisition and offer concrete design guidelines for arranging targets for eyes-free acquisition in immersive VR environments.

Original language: English (US)
Article number: 103400
Journal: Applied Ergonomics
Volume: 94
DOIs
State: Published - Jul 2021

All Science Journal Classification (ASJC) codes

  • Human Factors and Ergonomics
  • Physical Therapy, Sports Therapy and Rehabilitation
  • Safety, Risk, Reliability and Quality
  • Engineering (miscellaneous)

