Developing an effective immersive training tool requires a fine balance of technology capabilities with human factors. Achieving the training goals of building squad leadership skills, squad-level communication, and mission rehearsal for an entire squad brings together several components, including advanced body sensors that translate physical movements into avatar actions, and high-resolution head-mounted displays (HMDs) that immerse the user in the virtual world.

Figure 1. The ExpeditionDI technology immerses the nine-unit combat squad in a virtual battle environment.
Pilots have honed their flying skills through the use of simulators since the Link Company’s introduction of the Blue Box in the early 1930s. Since then, technology has advanced to the point where military and commercial pilots can now receive certifications entirely in a simulator. Similar advances, led by NASA and the US military, have extended high-fidelity simulation to spacecraft, ship bridges, ground vehicles, heavy construction equipment, and almost anything else that moves under control of a human operator.

In order to build effective simulators, it is important to step back from technology and first understand the human body and the training intent. Effective simulators tap key senses of the human body — sight, sound, touch, and smell. To be an effective training tool, a simulator must convince the trainee that he or she is in the real environment, which requires the person to mimic real-life physical motion or action in the virtual location. Flight simulators, for example, put pilots into a cockpit with a realistic look and feel and surround the pilot with displays to replicate a life-like visual environment. Anything short of that would decrease training effectiveness.

Effective simulation training for a nine-unit combat squad (see Figure 1) in the US Army poses significant challenges. How do you immerse a squad so they feel they are in a battle environment? How do you teach or reinforce key muscle memory of visual scanning, weapons movement, or replacing weapons clips while in a simulator? Putting a squad at computer screens with keyboards and mice may provide some visual cues, but it does not teach or reinforce muscle memory that will help soldiers in a real battle environment; clicking a mouse button is not the same as holding, aiming, and firing a real weapon.


Figure 2. The ExpeditionDI HMD displays a 60-degree FOV.
The first challenge is immersing all of a squad’s soldiers into a virtual environment. Until the holodeck technology from Star Trek is available, the next best choice is for each soldier to put on a high-resolution head-mounted display (see Figure 2). A 1280 x 1024-resolution OLED microdisplay provides enough pixels to display the virtual environment. With a display area of about 15 x 12 mm, however, advanced optics are needed to magnify the microdisplay. Optics with a wide field-of-view (FOV) immerse the soldier more fully, but that comes at a cost: magnified pixels lose acuity, or the sharpness of the image.
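The acuity trade-off can be made concrete with a little arithmetic. The sketch below assumes, as a simplification, that the horizontal FOV is spread evenly across the microdisplay’s 1280 horizontal pixels; 20/20 vision resolves roughly one arcminute, so a wider FOV spreads each pixel over a larger, more visible angle:

```python
# Approximate angular pitch of one HMD pixel, assuming (hypothetically)
# that the FOV is distributed evenly across the display's pixel columns.
def arcmin_per_pixel(fov_deg: float, pixels: int) -> float:
    """Average angular size of one pixel, in arcminutes (1 deg = 60 arcmin)."""
    return fov_deg * 60.0 / pixels

hmd = arcmin_per_pixel(60.0, 1280)   # ~2.8 arcmin per pixel at a 60-degree FOV
eye = 1.0                            # 20/20 vision resolves ~1 arcmin

print(f"HMD pixel pitch: {hmd:.2f} arcmin vs. ~{eye:.1f} arcmin for 20/20 vision")
```

Widening the FOV to 90 degrees with the same 1280 pixels would push the pitch past 4 arcminutes per pixel, which is why FOV cannot simply be increased without losing sharpness.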

Greatly magnified pixels also exhibit other optical distortions, such as barrel distortion (a bulge) or pincushion distortion (a pinch) of the image. These are normal artifacts of magnifying images through optics, and going to a wider FOV increases their effect.

Balancing field of view, acuity, and the amount of distortion to provide the best visual experience is a matter of technology, physics, and human vision. The ExpeditionDI HMD displays a 60-degree FOV; a greater field of view would decrease acuity, lessening the simulator’s effectiveness. Image pre-processing on the CPU adjusts the image to minimize optical distortion, resulting in the desired level of visual immersion.
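A common way such pre-processing works is to apply the inverse warp of the lens: if the optics add pincushion distortion, the renderer pre-distorts the frame with a matching barrel warp so the two cancel at the eye. The sketch below uses a first-order radial distortion model; the coefficient `k1` is a hypothetical value, since real systems calibrate it per lens, and this is not a description of ExpeditionDI’s actual pipeline:

```python
# First-order radial pre-warp: r' = r * (1 + k1 * r^2).
# A negative k1 produces a barrel warp, the usual counter to
# pincushion distortion from HMD magnifier optics.
K1 = -0.15  # hypothetical coefficient; real lenses are calibrated

def prewarp(x: float, y: float, k1: float = K1) -> tuple[float, float]:
    """Map a normalized image point (center = (0,0), edge radius ~1)
    through the radial distortion model."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

print(prewarp(0.0, 0.0))  # center is untouched: (0.0, 0.0)
print(prewarp(1.0, 0.0))  # edge point pulled inward: (0.85, 0.0)
```

In practice this mapping is evaluated per pixel (or per vertex of a warp mesh) on the GPU or CPU before the frame is sent to the display.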

Touch — Movement/Motion Tracking

Figure 3. Arms, legs, body, and head position need to be accurately tracked so that body motion translates to the movement of the soldier’s avatar in the virtual world.
Soldiers walk and run through the environment, duck and cover, change clips on their weapons, make gestures, crouch, and lie prone — important skills taught to every infantry soldier. In a virtual environment, those skills must carry over to their avatars. Simulators cannot replace these learned techniques; they need to reinforce the core skills for effective training.

Arms, legs, body, and head position need to be accurately tracked so that body motion can translate to the movement of the soldier’s avatar in the virtual world (see Figure 3). If a head turns to the left, for example, the virtual image in the HMD should move to the left at the same rate. Several technologies are available for tracking the body: MEMS inertial motion trackers, gyro trackers, and marker-based and markerless motion capture systems.
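For MEMS inertial trackers in particular, a common approach to turning raw sensor readings into a stable head orientation is a complementary filter: the gyroscope integrates quickly but drifts over time, while the accelerometer’s gravity reference is drift-free but noisy, so the two are blended. This is a generic sketch of the technique, not ExpeditionDI’s actual tracker; `ALPHA` and the sensor values are illustrative:

```python
# Complementary filter for one axis of head orientation (pitch, in radians).
ALPHA = 0.98  # weight on the integrated gyro estimate (illustrative value)

def update_pitch(pitch: float, gyro_rate: float, accel_pitch: float,
                 dt: float, alpha: float = ALPHA) -> float:
    """Blend fast-but-drifting gyro integration with the slow-but-stable
    pitch angle derived from the accelerometer's gravity vector."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Example: head tilting at 0.5 rad/s while the accelerometer reads 0.05 rad,
# sampled at 100 Hz for one second.
pitch = 0.0
for _ in range(100):
    pitch = update_pitch(pitch, 0.5, 0.05, dt=0.01)
print(f"estimated pitch after 1 s: {pitch:.3f} rad")
```

The update must run fast enough (typically hundreds of hertz) that the rendered view in the HMD keeps pace with head motion; noticeable lag between head turn and image update breaks immersion.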

