The figure depicts salient features of the optical layout of a desktop-scale virtual-reality system that is specialized for simulating a glove-box work space. The system generates stereoscopic left- and right-eye images of the interior of the work space that show (1) a user extension (defined here as an arm, a hand, and/or one or more fingers of a user) and (2) either a real or a virtual workpiece to be manipulated by the user extension. The positions and orientations of the user extension and the workpiece in the virtual images coincide substantially with those of the real user extension and of the real workpiece, if one exists. The images are compensated for the non-central positions of the user’s eyes relative to the viewing screen on which the images are presented.

Figure: This virtual-reality system simulates a work space immediately in front of the user, as in a glove box, and generates stereoscopic images that are compensated for changing positions of the user’s eyes.

The system consists of three major parts: the physical work space, a computer subsystem, and a display subsystem. In addition to the user extension, the physical work space contains sensors that track the position and orientation of the user extension, sensors that track the positions of the user’s eyes, and sensors that track the position and orientation of the workpiece (if any). Optionally, a pen-style haptic device is attached to the user extension: an assembly that can include actuators that apply forces to the end of a pen stylus held by the user to simulate contact between the stylus and a virtual workpiece. The sensor readings are digitized in real time and sent to the computer subsystem.
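
As a rough sketch of the kind of data that flows from the physical work space to the computer subsystem, the following C++ fragment defines a hypothetical tracking sample; the struct names, fields, and units are illustrative assumptions and are not taken from the actual system.

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

// Hypothetical pose sample produced by one tracking sensor.
// Field names and units are assumptions for illustration only.
struct PoseSample {
    std::array<double, 3> position{};               // metres, in work-space coordinates
    std::array<double, 4> orientation{1, 0, 0, 0};  // unit quaternion (w, x, y, z)
    std::uint64_t timestampMicros{};                // time at which the reading was digitized
};

// One frame of digitized sensor readings sent to the computer subsystem.
struct TrackingFrame {
    PoseSample leftEye;        // eye-tracking sensors (only position is needed)
    PoseSample rightEye;
    PoseSample userExtension;  // arm/hand/finger tracker
    PoseSample workpiece;      // meaningful only when a real workpiece is tracked
    bool hasWorkpiece = false;
};

int main() {
    TrackingFrame frame{};
    frame.userExtension.position = {0.10, 0.05, 0.30};
    frame.userExtension.timestampMicros = 1000;
    std::printf("extension at (%.2f, %.2f, %.2f)\n",
                frame.userExtension.position[0],
                frame.userExtension.position[1],
                frame.userExtension.position[2]);
}
```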

The computer subsystem runs software, denoted a simulation engine, that simulates the manipulation of the workpiece by the user extension. The simulation engine includes mathematical models of all relevant aspects of the three-dimensional geometry and physical properties of the user extension and the workpiece, and calculates interactions between objects using a physics-based force model. The simulation utilizes the tracking data provided by the sensors plus stored data (denoted virtual data) on the dimensions and physical properties of the user extension and the workpiece. The simulation can include such details as the positions and orientations of finger segments, contact forces with the stylus, and even deflections of the workpiece if the workpiece is a deformable body.
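
The contact calculation at the heart of such a physics-based force model can be illustrated with a simple penalty-force sketch like the one below; the point-on-plane contact model, the stiffness value, and the function names are assumptions chosen for clarity, not the engine’s actual formulation.

```cpp
#include <array>
#include <cstdio>

// Illustrative penalty-style contact force between a fingertip (or stylus
// tip) and a workpiece surface: the force pushes the tip back out along the
// surface normal, proportional to how far it has penetrated. The stiffness
// value and the simple point-on-plane contact model are assumptions.
using Vec3 = std::array<double, 3>;

Vec3 contactForce(const Vec3& tip, const Vec3& surfacePoint,
                  const Vec3& surfaceNormal, double stiffness = 500.0 /* N/m */) {
    double penetration = 0.0;                // depth of the tip below the surface
    for (int i = 0; i < 3; ++i)
        penetration += (surfacePoint[i] - tip[i]) * surfaceNormal[i];
    if (penetration <= 0.0)                  // not in contact
        return {0.0, 0.0, 0.0};
    return {stiffness * penetration * surfaceNormal[0],
            stiffness * penetration * surfaceNormal[1],
            stiffness * penetration * surfaceNormal[2]};
}

int main() {
    // Fingertip 2 mm below a horizontal surface whose normal points up.
    Vec3 f = contactForce({0.0, 0.0, -0.002}, {0.0, 0.0, 0.0}, {0.0, 0.0, 1.0});
    std::printf("contact force: (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
}
```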

The output of the simulation engine is fed to a software module that generates nominal left- and right-eye views for eyes that are spaced apart by an amount that can be adjusted to match the interpupillary distance of the user. The views are then processed by a software module that uses the eye-tracking sensor data to correct for the deviation of the user’s eye positions from the nominal viewing positions. An important part of the view-correction module is a submodule in which the geometry of the viewing screen and the user’s eyes is represented by means of an asymmetric frustum that is repeatedly updated in response to the eye-tracking data.
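
The asymmetric frustum amounts to an off-axis perspective projection whose left, right, bottom, and top extents at the near plane depend on where the tracked eye sits relative to the viewing screen. The sketch below shows one common way to compute those extents; the coordinate conventions and the function interface are assumptions for illustration, not the module’s actual code.

```cpp
#include <iostream>

// Frustum extents at the near plane, in the form accepted by
// glFrustum-style projection-matrix builders.
struct Frustum {
    double left, right, bottom, top, nearPlane, farPlane;
};

// Off-axis frustum for an eye located at (ex, ey, ez) in screen coordinates:
// the screen is the rectangle x in [-halfWidth, +halfWidth],
// y in [-halfHeight, +halfHeight] in the plane z = 0, and the eye looks
// along -z from a distance ez > 0 in front of the screen.
Frustum asymmetricFrustum(double ex, double ey, double ez,
                          double halfWidth, double halfHeight,
                          double nearPlane, double farPlane) {
    // Scale screen-edge offsets (measured from the eye) back to the near plane.
    double s = nearPlane / ez;
    return {(-halfWidth  - ex) * s,   // left
            ( halfWidth  - ex) * s,   // right
            (-halfHeight - ey) * s,   // bottom
            ( halfHeight - ey) * s,   // top
            nearPlane, farPlane};
}

int main() {
    // Example: eye 0.6 m from a 0.5 m x 0.3 m screen, shifted 0.03 m to the right.
    Frustum f = asymmetricFrustum(0.03, 0.0, 0.6, 0.25, 0.15, 0.1, 10.0);
    std::cout << f.left << " " << f.right << " "
              << f.bottom << " " << f.top << "\n";
}
```

In operation, one such frustum would be recomputed for each eye whenever new eye-tracking data arrive, with the two eye positions separated by the interpupillary distance noted above.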

The outputs of the view-correction module are synchronized by an alignment submodule, then fed to the display subsystem, wherein two display devices generate the left- and right-eye images. The images are projected onto the viewing screen through circularly polarizing filters, such that the left and right images are polarized orthogonally to each other. The user wears a pair of correspondingly polarized eyeglasses so that the left or right eye sees only the left or right image, respectively.
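
A hypothetical per-frame driver loop illustrates how these pieces fit together: read the latest eye positions, render one view per eye, and present both images in the same frame so that they remain aligned. Every function below is a placeholder standing in for the real tracker, renderer, and display hardware.

```cpp
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

struct EyePair { Vec3 left, right; };

EyePair readTrackedEyePositions() {
    // Stub: a fixed head pose 0.6 m in front of the screen, with the eyes
    // 0.065 m apart (an assumed interpupillary distance).
    Vec3 left  = {-0.0325, 0.0, 0.6};
    Vec3 right = { 0.0325, 0.0, 0.6};
    return {left, right};
}

void renderView(const Vec3& eye, const char* label) {
    // Stub standing in for the renderer and one display device.
    std::printf("%s view rendered for eye at (%.4f, %.4f, %.4f)\n",
                label, eye[0], eye[1], eye[2]);
}

void presentSynchronized() {
    // Stub for the alignment submodule: both display devices would be
    // made to update on the same refresh here.
    std::printf("frame presented\n");
}

int main() {
    EyePair eyes = readTrackedEyePositions();
    renderView(eyes.left, "left");
    renderView(eyes.right, "right");
    presentSynchronized();
}
```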

A somewhat more complex version of the system can include two or more physical-work-space/display-subsystem assemblies, possibly at different locations. The simulation engine at each work space is connected via a network to the other simulation engines. This system would make it possible for multiple remote users to work together, or simulate working together, on the same workpiece.
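
One plausible way to link the simulation engines is to exchange small state-update messages describing the shared workpiece and each remote user extension. The message layout below is an assumed sketch of that idea only; no particular network protocol is described in the original work.

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

// Hypothetical state-update message exchanged between networked simulation
// engines. All field names, contents, and the idea of a fixed-rate broadcast
// are assumptions used only to illustrate the concept.
struct RemoteStateUpdate {
    std::uint32_t siteId;                        // which work space sent the update
    std::uint64_t simulationTimeMicros;          // sender's simulation clock
    std::array<double, 3> workpiecePosition;     // shared workpiece pose ...
    std::array<double, 4> workpieceOrientation;  // ... as a unit quaternion
    std::array<double, 3> extensionTipPosition;  // remote user's fingertip or stylus
    std::array<double, 3> appliedForce;          // force the remote user is applying
};

int main() {
    RemoteStateUpdate msg{};
    msg.siteId = 1;
    msg.workpiecePosition = {0.0, 0.1, 0.2};
    std::printf("update from site %u, workpiece y = %.2f m\n",
                msg.siteId, msg.workpiecePosition[1]);
}
```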

This work was done by J. D. Smith and R. D. Boyle of Ames Research Center and I. Twombly of Universities Space Research.

This invention is owned by NASA and a patent application has been filed. Inquiries concerning rights for the commercial use of this invention should be addressed to the Ames Technology Partnerships Division at (650) 604-2954.

Refer to ARC-14756-1.