Even small changes in a soldier’s surroundings can indicate danger. Now a robot can detect those changes, and a warning can immediately alert the soldier through a display in her eyeglasses. Researchers have demonstrated, in a real-world environment, the first human-robot team in which the robot detects physical changes in 3D and shares that information with a human in real time through augmented reality. The human can then evaluate the information and decide on follow-on action.

The work was done to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios. Most academic research on mixed-reality interfaces for human-robot teaming does not enter real-world environments; instead it relies on external instrumentation in a lab to manage the calculations needed to share information between a human and a robot. Likewise, most engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots.

The new research paired a human teammate wearing augmented reality glasses with a small autonomous mobile ground robot that used a laser-ranging sensor, known as LiDAR, to build a representation of the environment. As the robot patrolled, it compared its current and previous readings to detect changes in the environment. Those changes were instantly displayed in the human’s eyewear so the researchers could determine whether the human could interpret them.
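The article does not describe the robot's change-detection algorithm, but a common approach to comparing successive 3D scans is voxel-based differencing: points of the current scan that fall in voxels left empty by the previous scan are flagged as changes. The sketch below is a minimal illustration of that idea in Python with NumPy; the function names, point counts, and the 0.2 m voxel size are assumptions for illustration, not details from the study.

```python
import numpy as np

def occupied_voxels(points, voxel_size):
    """Map each 3D point to its integer voxel index; return the set of occupied voxels."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

def detect_changes(reference, current, voxel_size=0.2):
    """Return points of `current` that fall in voxels unoccupied in `reference`."""
    ref = occupied_voxels(reference, voxel_size)
    idx = np.floor(current / voxel_size).astype(int)
    mask = np.array([tuple(i) not in ref for i in idx])
    return current[mask]

# Toy example: the previous patrol's scan, plus a small object that appeared since.
rng = np.random.default_rng(0)
previous_scan = rng.uniform(0.0, 10.0, size=(5000, 3))            # earlier map of the area
new_object = rng.uniform(0.0, 0.3, size=(50, 3)) + [5.0, 5.0, 0.0]
current_scan = np.vstack([previous_scan, new_object])

changed = detect_changes(previous_scan, current_scan)             # mostly the new object
```

A real pipeline would first register the two scans into a common frame (via odometry or scan matching); this sketch assumes they are already aligned.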

In studying communication between the robot and the human, the researchers tested LiDAR sensors of different resolutions on the robot to collect measurements of the environment and detect changes. When those changes were shared with the human through augmented reality, the researchers found that human teammates could interpret changes that even the lower-resolution LiDARs detected. This indicates that, depending on the size of the changes the robot is expected to encounter, lighter, smaller, and less expensive sensors could perform just as well and run faster in the process.
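One way to see why coarser sensors can suffice is to simulate resolution by subsampling a dense scan and checking whether a sizable change still produces newly occupied voxels. The sketch below is a hypothetical experiment of my own construction, not the researchers' protocol; the point counts, voxel size, and object dimensions are all assumed.

```python
import numpy as np

def occupied_voxels(points, voxel_size=0.25):
    """Set of integer voxel indices occupied by a point cloud."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

rng = np.random.default_rng(1)
# Dense scan of a flat 10 m x 10 m area, and a 1 m cube that appeared on it.
ground = rng.uniform(0.0, 10.0, size=(20000, 3)) * [1.0, 1.0, 0.01]
cube = rng.uniform(0.0, 1.0, size=(2000, 3)) + [4.0, 4.0, 0.0]

detected = {}
for n in (20000, 2000, 200):                  # emulate progressively coarser LiDARs
    ref = ground[rng.choice(len(ground), n, replace=False)]
    obj = cube[rng.choice(len(cube), max(n // 100, 10), replace=False)]
    new = occupied_voxels(np.vstack([ref, obj])) - occupied_voxels(ref)
    detected[n] = len(new)                    # changed-voxel count at this resolution
```

Even at a hundredth of the original point density, a cube-sized change still occupies voxels the reference scan never touched, which is consistent with the finding that lower-resolution sensors detected changes large enough for human teammates to interpret.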

Future studies will continue to explore how to strengthen teaming between humans and autonomous agents by allowing the human to interact with the detected changes, which will give the robot more information about the context of each change: for example, whether it was made by an adversary, is a natural environmental change, or is a false positive. This will improve the robotic platform’s autonomous context understanding and reasoning, such as by enabling the robot to learn and predict which types of changes constitute a threat. In turn, providing this understanding to the autonomy will help researchers learn how to improve the teaming of soldiers with autonomous platforms.

For more information, contact the U.S. Army CCDC Army Research Laboratory Public Affairs at 703-693-6477.