The Full Immersion Telepresence Testbed (FITT) is an experimental remote-control system for an anthropomorphic robot. It is so named because it gives the human operator some of the sensations of the remote environment, as though the operator were there in place of the robot. In comparison with older telerobotic control systems, the FITT provides sensory feedback better matched to the operator’s senses and enables the operator and robot, acting together, to respond to changes in the remote environment with more natural, humanlike motions. In particular, correlating the operator’s movements (principally of the head, arms, and hands) with movements of the robot creates a more intuitive method of teleoperating robotic manipulators. Moreover, using the operator’s own movements to control the robot provides the operator with useful kinesthetic feedback. The FITT and systems like it are expected to reduce training time and costs, task-completion times, and operator workloads and errors. Because the FITT concept mates human intelligence with the durability of robots, it is potentially useful for performing complex tasks in harsh environments; for example, cleaning up radioactive and toxic wastes.

Figure: An operator using the FITT controls a remote robot by a combination of hand, arm, head, and foot motions and vocal commands.
The main component of the FITT is an operator’s chair mounted on a rotating base (see figure). Using foot pedals on the chair, the operator can command direct-drive motors on both the base of the chair and the remote robot. The chair also houses equipment for controlling a video-camera unit, manipulators, and end effectors on the remote robot. The operator wears a helmet-mounted video display unit that presents 60°-field-of-view stereoscopic images from the video-camera unit on the robot. Stereoscopic imaging creates a perception of depth, one of the most important “immersion” features of the system. The helmet also includes stereophonic headphones for audio feedback and a microphone for operator voice commands. A position-and-orientation sensor on top of the helmet is the source of commands that control the orientation of the video-camera unit on the remote robot. Other sensors attached to the operator’s wrists provide three-dimensional position and pitch, yaw, and roll signals for remote control of the positions and orientations of tools held by the robotic manipulators. Instrumented gloves measure the operator’s finger-joint angles and the pitch and yaw of each hand, thereby providing signals for dexterous teleoperation of robotic grippers and hands.
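
To make the motion mapping concrete, the following minimal C sketch shows one way a head-tracker reading could be turned into pan/tilt commands for the robot’s video-camera unit. The structure names, field layout, and angular limits are assumptions made for illustration; they are not taken from the FITT software.

/* Illustrative sketch (not the FITT flight code): mapping a head-tracker
 * reading to pan/tilt commands for the robot's camera unit.  The sensor
 * structure, command fields, and angular limits are hypothetical. */
#include <stdio.h>

/* Hypothetical head-tracker sample: yaw, pitch, and roll in degrees. */
typedef struct {
    double yaw;
    double pitch;
    double roll;
} HeadPose;

/* Hypothetical camera command: pan and tilt setpoints in degrees. */
typedef struct {
    double pan;
    double tilt;
} CameraCmd;

/* Clamp a value to the camera unit's assumed mechanical limits. */
static double clamp(double v, double lo, double hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Map the operator's head orientation directly onto the camera unit,
 * so the remote view follows the operator's gaze. */
static CameraCmd head_to_camera(HeadPose head)
{
    CameraCmd cmd;
    cmd.pan  = clamp(head.yaw,   -90.0, 90.0);  /* assumed +/-90 deg pan range  */
    cmd.tilt = clamp(head.pitch, -45.0, 45.0);  /* assumed +/-45 deg tilt range */
    return cmd;
}

int main(void)
{
    HeadPose head = { 30.0, -10.0, 0.0 };       /* example tracker sample */
    CameraCmd cmd = head_to_camera(head);
    printf("pan = %.1f deg, tilt = %.1f deg\n", cmd.pan, cmd.tilt);
    return 0;
}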

Inasmuch as the operator’s hands and eyes are “immersed” in the locally synthesized version of the remote environment, they are not available for initiating commands. Therefore, a voice-recognition subsystem provides a convenient way to blend automated commands with direct operator control. To prevent the subsystem from picking up extraneous speech, the operator must press a foot pedal to enable it to receive a spoken command. Once the pedal is released, the command is processed and played back to the operator over a voice synthesizer for confirmation. Vocally commanded motions range in complexity from a simple repositioning of a robot arm to a maneuver such as grappling and turning a dial.
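
A minimal C sketch of this push-to-talk gating follows. The pedal, recognizer, and synthesizer functions are stubs invented for illustration; the actual interface to the voice-recognition circuit board is not described here.

/* Sketch of the push-to-talk gating described above (all names hypothetical).
 * Stub functions simulate the pedal and recognizer so the sketch compiles. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* --- Stubs standing in for hardware/driver calls (assumptions) --- */
static bool pedal_pressed(int tick)        { return tick >= 2 && tick <= 5; }
static void recognizer_listen(bool enable) { (void)enable; }
static bool recognizer_result(char *buf, int n)
{
    strncpy(buf, "OPEN GRIPPER", n - 1); buf[n - 1] = '\0'; return true;
}
static void synthesizer_say(const char *t)        { printf("synth: %s\n", t); }
static void dispatch_robot_command(const char *t) { printf("robot: %s\n", t); }

/* One pass of the voice-command loop: speech reaches the recognizer only
 * while the pedal is held; on release, the recognized phrase is echoed to
 * the operator by voice synthesis and then handed to the robot controller. */
static void voice_command_cycle(int tick, bool *was_pressed)
{
    bool pressed = pedal_pressed(tick);
    char phrase[128];

    recognizer_listen(pressed);            /* gate out extraneous speech */

    if (*was_pressed && !pressed &&        /* pedal just released        */
        recognizer_result(phrase, sizeof phrase)) {
        synthesizer_say(phrase);           /* confirmation playback      */
        dispatch_robot_command(phrase);
    }
    *was_pressed = pressed;
}

int main(void)
{
    bool was_pressed = false;
    for (int tick = 0; tick < 8; tick++)
        voice_command_cycle(tick, &was_pressed);
    return 0;
}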

The software that controls the system is hosted on a UNIX/VersaModule Eurocard (VME) computer workstation and on a personal computer equipped with a 486 microprocessor and a voice-recognition circuit board. Data from the position-and-orientation sensors, instrumented gloves, and foot pedals are sampled at a rate of approximately 10 Hz. The data are sent over a local-area network by high-level communication software that transfers them through a C-language program, making the low-level driver interfaces transparent.
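
As an illustration only, the following C sketch shows a sampling loop of the kind described: a packed structure of operator-input readings sent over the network at roughly 10 Hz. The packet layout, destination address, and port are placeholders; the actual communication library and message format are not specified here.

/* Illustrative ~10 Hz sampling-and-transfer loop (hypothetical packet layout,
 * address, and port).  Uses plain POSIX UDP sockets as a stand-in for the
 * high-level communication software mentioned above. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Hypothetical packed sample of the operator-input devices. */
struct fitt_sample {
    float head_pose[6];        /* x, y, z, pitch, yaw, roll of the helmet sensor */
    float wrist_pose[2][6];    /* the same six values for each wrist sensor      */
    float glove_joints[2][18]; /* finger-joint angles from each glove            */
    float pedals[2];           /* foot-pedal positions                           */
};

int main(void)
{
    /* UDP socket to the workstation running the robot-control software;
     * the address and port are placeholders. */
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) {
        perror("socket");
        return 1;
    }

    struct sockaddr_in dest;
    memset(&dest, 0, sizeof dest);
    dest.sin_family = AF_INET;
    dest.sin_port = htons(5000);
    inet_pton(AF_INET, "192.168.1.10", &dest.sin_addr);

    struct fitt_sample sample;
    memset(&sample, 0, sizeof sample);

    for (int i = 0; i < 50; i++) {
        /* In the real system these values would come from the tracker,
         * glove, and pedal drivers; here they are left zeroed. */
        sendto(sock, &sample, sizeof sample, 0,
               (struct sockaddr *)&dest, sizeof dest);
        usleep(100000);        /* 100 ms period, i.e. roughly 10 Hz */
    }

    close(sock);
    return 0;
}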

This work was done by Larry C. H. Li of Johnson Space Center and Myron A. Diftler and Susan S. Shelton of Lockheed Martin. For further information, contact the Johnson Commercial Technology Office at 281-483-3809. MSC-22733