Evaluating Immersive Military Training Systems

Military training has been one of the most successful applications of mixed reality (MR) technologies. MR systems (including virtual reality (VR) and augmented reality (AR)) provide a realistic first-person view of a synthetic world or an augmented real-world environment and allow trainees to interact directly with that world, making them useful for training motor skills, decision-making, and navigation. However, MR technologies span a wide range of hardware and software configurations, and very little is known about which technologies provide the most benefit for particular military training scenarios. To achieve the best return on investment, the Navy needs empirical data on the effectiveness of MR technologies.

To address this gap in our knowledge, empirical studies are needed. However, the fact that MR technologies are diverse and constantly changing limits our ability to perform studies with generalizable results. In our project, we address this limitation with two major contributions. First, we propose a general framework around the concept of immersion, or display fidelity, for characterizing and understanding MR displays. Second, we contribute an evaluation method called MR simulation, in which existing displays, non-existent displays, and even the real world can be simulated within a high-fidelity VR system.

Together, these two contributions allow us to conduct controlled experiments examining the influence of display fidelity on the effectiveness of training systems, and to extract general findings from these experiments that can be applied to the design of new systems. In practical terms, our contributions enable guidelines for choosing the best display hardware for a new training system and cost/benefit analyses of proposed new display technologies.

Our activities to date have focused on task/domain analysis, MR simulator software, validation studies, and controlled studies in MR simulators.

Task/Domain Analysis

  • Completed a literature review of the use of MR systems in military training.
  • Discussed our work with Richard Arnold of the Naval Aerospace Medical Research Laboratory, and gained insight into task analysis of training tasks in military training simulators.
  • Visited the Naval Research Laboratory to discuss our work with Dr. Jim Templeman, Dr. Linda Sibert, and Dr. Mark Livingston, and gained insight into training tasks relevant to the Navy that would be appropriate for our empirical studies.
  • Visited the Institute for Creative Technologies at the University of Southern California to meet with Dr. David Krum and see the MR training systems that have been developed there.

MR Simulator Software

  • Developed an initial version of an MR simulator software architecture using the Vizard VR toolkit.
  • Designed a software architecture that will allow MR simulation with multiple display platforms, including head-mounted displays (HMDs), surround-screen displays, and the UCSB Allosphere.
  • Developed software tools allowing us to independently control field of view (FOV), field of regard (FOR), stereoscopy, view rotation amplification, jitter, frame rate, artificial latency, and simulator latency (a minimal configuration sketch follows this list).
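
To make the parameter list above concrete, the following is a minimal, framework-agnostic Python sketch of how such controls might be organized. The class and field names are ours, not the project's (the actual system is built on the Vizard toolkit and is not shown here), and the latency injector illustrates one common way to add whole-frame delays on top of a platform's inherent latency.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class DisplayFidelityConfig:
    """Independently controllable display-fidelity parameters (names are ours)."""
    fov_deg: float = 60.0               # field of view rendered each frame
    for_deg: float = 360.0              # field of regard reachable by turning the head
    stereo: bool = True                 # stereoscopic vs. monoscopic rendering
    rotation_gain: float = 1.0          # amplification applied to physical head yaw
    jitter_deg: float = 0.0             # artificial orientation noise
    frame_rate_hz: float = 60.0         # simulated refresh rate
    artificial_latency_frames: int = 0  # extra end-to-end delay, in frames


class LatencyInjector:
    """Delays head-pose samples by a fixed number of frames, adding artificial
    latency on top of whatever latency the simulator platform already has."""

    def __init__(self, delay_frames: int):
        self.delay_frames = delay_frames
        self.buffer = deque()

    def push(self, pose):
        """Store the newest pose and return the pose to render this frame."""
        self.buffer.append(pose)
        if len(self.buffer) > self.delay_frames + 1:
            self.buffer.popleft()
        return self.buffer[0]


# Example: a narrow-FOV, monoscopic condition with two extra frames of latency.
config = DisplayFidelityConfig(fov_deg=40.0, stereo=False, artificial_latency_frames=2)
injector = LatencyInjector(config.artificial_latency_frames)
for frame, pose in enumerate([(0, 0), (1, 0), (2, 1), (3, 1)]):
    print(frame, injector.push(pose))
```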

Validation Studies

  • Replicated a study from the MR literature on the effects of latency on an object manipulation task. This demonstrated that we could use the MR simulator to reproduce a real-world AR system: we found the same significant effects, but absolute performance values were not equivalent.
  • Conducted a follow-up study demonstrating that the performance differences in the replication study were due to latency inherent in the simulator, and that the effects of simulator latency and artificial latency were additive (see the illustrative arithmetic after this list). Together, these studies support our hypothesis that MR simulation produces valid results.
  • Currently running a study designed to demonstrate that simulator latency has no effect on a purely visual task.
  • Planning a study designed to demonstrate that results from a study using simulated displays are equivalent to results from a study using the actual displays.
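
As a hedged illustration of the additive-latency result above, the arithmetic below uses invented numbers, not values measured in the studies, to show why absolute performance in the replication differed and how total latency could be matched to a target system.

```python
# Illustrative arithmetic only; the latency values below are invented, not study data.
simulator_latency_ms = 50.0    # latency inherent in the VR platform doing the simulating
artificial_latency_ms = 100.0  # latency injected in software to mimic the target AR system

# Under the additive model supported by the follow-up study, the latency a
# participant actually experiences is the sum of the two components.
effective_latency_ms = simulator_latency_ms + artificial_latency_ms  # 150.0 ms

# So to reproduce a system with, say, 120 ms end-to-end latency, the injected
# latency must be reduced by the simulator's own baseline.
target_system_latency_ms = 120.0
required_artificial_ms = max(0.0, target_system_latency_ms - simulator_latency_ms)  # 70.0 ms
print(effective_latency_ms, required_artificial_ms)
```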

Controlled Studies in MR Simulators

  • Conducted a study examining the effects of amplified head rotations on visual scanning performance in a military setting (a minimal sketch of the amplification mapping follows this list). Found that although amplification was difficult for users to detect, high levels of amplification could degrade performance in a counting task.
  • Conducted three experiments in the Duke Immersive Virtual Environment (DiVE) to study the effects of display fidelity and interaction fidelity on performance, presence, engagement, and usability in a first-person shooter (FPS) game, such as those sometimes used for military training. Results showed that conditions simulating a desktop PC and conditions simulating the real world outperformed mixed conditions that had high display fidelity or high interaction fidelity, but not both. Fully surrounding screens and 3D pointing were especially beneficial for visual search, aiming, and firing tasks.
  • Conducted an experiment to examine recovery methods for vision-based AR tracking systems.
  • Conducted a study comparing static visual highlighting to stereoscopic highlighting (a new technique we developed) on 2D and 3D graph layouts.
  • Currently conducting a study examining the effects of display fidelity on training effectiveness for a visual scanning training task.
  • Planning an experiment to study the effects of amplification on spatial understanding and training effectiveness.
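
The sketch below illustrates the basic head-rotation amplification mapping referenced above. It is a minimal illustration with names and gain values of our own, not the code used in the studies.

```python
def amplified_yaw(physical_yaw_deg: float, gain: float) -> float:
    """Map physical head yaw to virtual yaw.

    With gain > 1, a limited physical field of regard (e.g., a seated user, or a
    tracked space smaller than 360 degrees) covers a full virtual field of regard.
    """
    return physical_yaw_deg * gain


# Example: a gain of 2 lets +/-90 degrees of physical rotation span the full circle.
for physical in (0.0, 45.0, 90.0, -90.0):
    print(f"physical {physical:6.1f} deg -> virtual {amplified_yaw(physical, gain=2.0):6.1f} deg")
```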

Journal Articles

Eric D. Ragan, Siroberto Scerbo, Felipe Bacim, and Doug A. Bowman. "Amplified Head Rotation in Virtual Reality and the Effects on 3D Search, Training Transfer, and Spatial Orientation." IEEE Transactions on Visualization and Computer Graphics, vol. 23, pp. 1880–1895, 2017. ISSN: 1077-2626.

Eric D. Ragan, Doug A. Bowman, Regis Kopper, Cheryl Stinson, Siroberto Scerbo, and Ryan P. McMahan. "Effects of field of view and visual complexity on virtual reality training effectiveness for a visual scanning task." IEEE Transactions on Visualization and Computer Graphics, 14 pages, 2015. ISSN: 1077-2626.

Ryan P. McMahan, Doug A. Bowman, David J. Zielinski, and Rachael B. Brady. "Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game." IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 4, pp. 626–633, 2012. ISSN: 1077-2626.

Conferences

Doug A. Bowman, Eric D. Ragan, Siroberto Scerbo, and Felipe Bacim. "Evaluating the Impact of Head Rotation Amplification on Virtual Reality Training Effectiveness." Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), 2013.

Doug A. Bowman, Cheryl Stinson, Eric D. Ragan, Siroberto Scerbo, Tobias Hollerer, Cha Lee, Ryan P. McMahan, and Regis Kopper. "Evaluating effectiveness in virtual environments with MR simulation." Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), National Training and Simulation Association, Orlando, FL, 2012.

Cha Lee, Scott Bonebrake, Doug A. Bowman, and Tobias Hollerer. "The role of latency in the validity of AR simulation." 2010 IEEE Virtual Reality Conference (VR), IEEE, Boston, MA, USA, 2010. ISBN: 978-1-4244-6237-7.
