Teleoperation of Unmanned Vehicles Using Immersive Telepresence

In order to extend the usefulness of small, unmanned ground vehicles (UGVs) to a wider range of missions, techniques are being developed to enable high-speed teleoperated control. Our goal is to quadruple the speed of teleoperated UGVs compared to currently deployed models. The key limitation is not mechanical; it is the operator's ability to maintain situational awareness and control at higher speeds. To address this limitation, we are developing technologies for immersive teleoperation and driver-assist behaviors.

Our immersive teleoperation system uses a head-mounted display and head-aimed cameras to provide the operator with the illusion of being in the vehicle itself. Driver-assist behaviors will reduce the cognitive load on the operator by automatically avoiding obstacles while maintaining a specified heading or following a building wall or street. We’ve demonstrated immersive teleoperation on the iRobot Warrior UGV and a high-speed surrogate UGV.

Small UGVs such as the iRobot PackBot have revolutionized the way in which soldiers fight wars. A typical UGV transmits video from an onboard camera back to the operator control unit (OCU), which displays the video on a computer screen. In a manner similar to playing a first-person shooter video game, the operator drives the UGV using a joystick, gamepad, or other input device to control vehicle motion. While this teleoperation method works well at slow speeds in simple environments, viewing the world through a fixed camera limits the operator’s situational awareness. Even joystick-controlled cameras that pan and tilt can be distracting to operate while driving the vehicle. This is one of the reasons why small UGVs have been limited to traveling at slow speeds.

Faster small UGVs would be useful in a wide range of military operations. When an infantry squad storms a building held by insurgents, speed is essential to maintain the advantage of surprise. When a dismounted infantry unit patrols a city on foot, the soldiers need a UGV that can keep up. However, driving at high speeds through complex urban environments is difficult for any vehicle, and small UGVs face additional challenges. They must steer around obstacles that larger vehicles can simply drive over: a bump that would be absorbed by a large vehicle’s suspension can send a small, fast-moving UGV flying into the air.

For the Stingray Project, funded by the US Army Tank-Automotive Research, Development and Engineering Center (TARDEC), iRobot Corporation (Bedford, MA) and Chatten Associates (West Conshohocken, PA) developed technologies that enable teleoperation of small UGVs at high speeds through urban terrain. Our approach combines immersive telepresence with semi-autonomous driver-assist behaviors, which command the vehicle to safely maneuver according to the driver’s intent.
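The Stingray software itself is not public, but the idea of a driver-assist behavior that maneuvers according to the driver's intent can be illustrated with a minimal sketch. Everything here — the function name, the candidate-heading search, and the cost weights — is hypothetical, not the project's actual implementation:

```python
import math

def assist_steering(desired_heading, obstacles, safety_radius=1.5):
    """Choose a heading near the operator's intent that avoids obstacles.

    desired_heading: operator's intended heading in radians (vehicle frame).
    obstacles: list of (bearing_rad, range_m) returns from a range sensor.
    Returns the commanded heading in radians.

    Candidate headings are scored by deviation from the driver's intent
    plus a penalty for pointing toward nearby obstacles; the cheapest
    candidate wins.
    """
    candidates = [math.radians(a) for a in range(-90, 91, 5)]

    def cost(h):
        c = abs(h - desired_heading)  # stay close to driver intent
        for bearing, rng in obstacles:
            if rng < safety_radius * 4:
                # penalize headings aimed near close obstacles,
                # more strongly the closer the obstacle is
                angular_gap = abs(h - bearing)
                proximity = (safety_radius * 4) / max(rng, 0.1)
                c += max(0.0, 1.0 - angular_gap / math.radians(45)) * proximity
        return c

    return min(candidates, key=cost)
```

With no obstacles the vehicle simply follows the commanded heading; with an obstacle dead ahead, the commanded heading swings around it and returns to the driver's intent once the path is clear.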

In Phase I of the project, we mounted a Chatten Head-Aimed Remote Viewer (HARV) on an iRobot Warrior UGV prototype (Figure 1) and on a small surrogate UGV based on a high-speed, gas-powered, radio-controlled car platform. The operator wears a head-mounted display and a head tracker (Figure 2). The display shows the video from the HARV’s camera, which is mounted on a pan/tilt/roll gimbal. The HARV tracks the operator’s head position and turns the camera to face in the same direction.
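As a rough illustration of that head-to-gimbal loop, the sketch below clamps each tracked head axis to the gimbal's travel and rate-limits the slew. The limits, rates, and function name are invented for illustration and are not the HARV's actual specifications:

```python
def gimbal_command(head_pose, prev_cmd, dt, max_rate=3.0,
                   limits=((-2.79, 2.79), (-1.0, 1.0), (-0.7, 0.7))):
    """Convert a tracked head orientation into pan/tilt/roll gimbal commands.

    head_pose: (yaw, pitch, roll) from the head tracker, in radians.
    prev_cmd:  previously commanded (pan, tilt, roll), in radians.
    dt:        control period in seconds.

    Each axis is clamped to its mechanical travel and slewed no faster
    than max_rate rad/s so the gimbal tracks head motion smoothly.
    """
    cmd = []
    for target, prev, (lo, hi) in zip(head_pose, prev_cmd, limits):
        target = max(lo, min(hi, target))                    # respect gimbal travel
        step = max(-max_rate * dt, min(max_rate * dt, target - prev))
        cmd.append(prev + step)                              # rate-limit the slew
    return tuple(cmd)
```

Run at the camera frame rate, this keeps the camera pointed where the operator is looking without commanding impossible accelerations from the gimbal motors.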

For Phase II, we increased the Warrior UGV’s top speed by developing a high-speed, wheeled version of the Warrior (Figure 3). To assist the driver at this speed, we used LIDAR to determine the orientation of features such as street boundaries, building walls, and tree lines. With these capabilities, operators can drive the UGV at much higher speeds.
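One simple way to recover the orientation of a street boundary, wall, or tree line from 2-D LIDAR returns is a total-least-squares line fit over the scan points. This sketch — a generic technique, not the Stingray code — returns the orientation of the point cloud's principal axis:

```python
import math

def feature_orientation(points):
    """Estimate the dominant orientation of a linear feature (wall, curb,
    tree line) from 2-D LIDAR returns via a total-least-squares line fit.

    points: list of (x, y) in meters, vehicle frame.
    Returns the feature's orientation in radians.
    """
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    # second moments of the point cloud about its centroid
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    # orientation of the principal axis of the covariance
    return 0.5 * math.atan2(2 * sxy, sxx - syy)
```

A driver-assist behavior can then servo the vehicle's heading to this angle to follow the wall or street; in practice the raw scan would first be segmented so only the returns belonging to one feature are fit.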

Head-Aimed Remote Viewer

Testing of remotely operated ground vehicles has shown that head-aimed vision improves teleoperation mission performance by 200 to 400%, depending on the task. In general, the more complex the task, the greater the relative advantage provided by head-aimed vision.

Chatten Associates developed the ruggedized HARV for operating a small robotic ground vehicle. Previous experiments have shown that only head-aimed vision can provide sufficient awareness for teleoperation at high speeds. The operator wears a head-mounted display and a head tracker. As the operator turns his head, the HARV gimbal automatically turns the cameras to face the corresponding direction. This provides a far more immersive experience than aiming the camera with a joystick (even with the same head-mounted display).
