The Full Immersion Telepresence Testbed (FITT) is an experimental remote-control system for an anthropomorphic robot, so named because it gives the human operator some of the sensations of the remote environment, as though the operator were in that environment in place of the robot. In comparison with older telerobotic control systems, the FITT provides sensory feedback better suited to the operator’s senses and enables the operator and robot, acting together, to respond to changes in the remote environment with more nearly natural, humanlike motions. In particular, correlating the operator’s movements (principally of the head, arms, and hands) with the movements of the robot creates a more intuitive method of teleoperating robotic manipulators. Moreover, using the operator’s own movements to control the robot provides the operator with useful kinesthetic feedback. The FITT and other systems like it are expected to reduce training times and costs, task-completion times, and operator workloads and errors. Because the FITT concept mates human intelligence with the durability of robots, it is potentially useful for performing complex tasks in harsh environments; for example, cleaning up radioactive and toxic wastes.

Inasmuch as the operator’s hands and eyes are “immersed” in the locally synthesized version of the remote environment, they are not available for initiating commands. Therefore, a voice-recognition subsystem provides a convenient way to blend automated commands with direct operator control. To prevent the voice-recognition subsystem from picking up extraneous inputs, the operator must press a foot pedal to enable the subsystem to receive a spoken command. Once the pedal is released, the command is processed and played back to the operator over a voice synthesizer for confirmation. Vocally commanded motions range in complexity from simple repositioning of a robot arm to more complex maneuvers such as grappling and turning a dial.
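The pedal-gated flow just described amounts to a simple push-to-talk state machine. The following is a minimal sketch of how it might be structured in C; the driver functions (pedal_is_pressed, recognizer_enable, and so on) are illustrative assumptions, as the article does not describe the actual FITT interfaces:

    #include <unistd.h>

    /* Hypothetical interfaces to the pedal, recognizer, and synthesizer;
       the real FITT driver interfaces are not described in this article. */
    extern int  pedal_is_pressed(void);                   /* 1 while the foot pedal is held */
    extern void recognizer_enable(int on);                /* gate the voice-recognition board */
    extern int  recognizer_get_command(char *buf, int n); /* 1 if a command was captured */
    extern void synthesizer_say(const char *text);        /* play text back to the operator */
    extern void execute_robot_command(const char *cmd);

    /* Listen only while the pedal is held; on release, process the command
       and echo it through the voice synthesizer for confirmation. */
    void voice_command_loop(void)
    {
        char cmd[128];
        int listening = 0;

        for (;;) {
            int pressed = pedal_is_pressed();

            if (pressed && !listening) {        /* pedal went down: open the mic */
                recognizer_enable(1);
                listening = 1;
            } else if (!pressed && listening) { /* pedal released: process command */
                recognizer_enable(0);
                listening = 0;
                if (recognizer_get_command(cmd, (int)sizeof cmd)) {
                    synthesizer_say(cmd);       /* playback for operator confirmation */
                    execute_robot_command(cmd); /* whether execution waits for explicit
                                                   confirmation is not specified */
                }
            }
            usleep(10000);                      /* poll the pedal at a modest rate */
        }
    }

Gating the recognizer on the pedal rather than filtering continuously keeps extraneous speech out of the recognizer entirely, which is simpler and more robust than rejecting spurious matches after the fact.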
The software that controls the system is hosted on a UNIX/VersaModule Eurocard (VME) computer workstation and on a personal computer equipped with a 486 microprocessor and a voice-recognition circuit board. Data from the position and orientation sensors, instrumented gloves, and foot pedals are sampled at a rate of approximately 10 Hz. The data are sent over a local-area network by high-level communication software that enables a C-language program to transfer the data while keeping the low-level driver interfaces transparent.
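A sketch of that sampling-and-send cycle is shown below. The record layout, field sizes, and wrapper functions are assumptions for illustration; only the approximately 10-Hz rate and the sensor sources come from the article:

    #include <unistd.h>

    #define SAMPLE_PERIOD_US 100000  /* ~10 Hz, as stated in the article */

    /* One sample of operator state (assumed layout). */
    struct operator_sample {
        float head_pos[3],  head_orient[4];  /* position + orientation quaternion */
        float hand_pos[3],  hand_orient[4];
        float glove_joints[22];              /* instrumented-glove joint angles */
        int   pedal_state;
    };

    /* Hypothetical wrappers; in the FITT, high-level communication software
       hides the low-level driver interfaces from the C-language program. */
    extern void read_trackers(struct operator_sample *s);
    extern void read_gloves(struct operator_sample *s);
    extern void read_pedals(struct operator_sample *s);
    extern void net_send(const void *data, unsigned len); /* local-area-network transfer */

    void sample_loop(void)
    {
        struct operator_sample s;

        for (;;) {
            read_trackers(&s);
            read_gloves(&s);
            read_pedals(&s);
            net_send(&s, sizeof s);   /* ship the sample over the network */
            usleep(SAMPLE_PERIOD_US); /* hold the approximately 10-Hz rate */
        }
    }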
This work was done by Larry C. H. Li of Johnson Space Center and Myron A. Diftler and Susan S. Shelton of Lockheed Martin. For further information, contact the Johnson Commercial Technology Office at 281-483-3809. MSC-22733
