2009

Self-Supervised Learning of Terrain Traversability From Proprioceptive Sensors

This system enables a vehicle to scan its surroundings and adapt to conditions by learning about them on the fly.

Robust and reliable autonomous navigation in unstructured, off-road terrain is a critical element in making unmanned ground vehicles a reality. Existing approaches tend to evaluate terrain traversability using fixed parameters obtained by testing in specific environments. The result is a system that handles well the terrain it was trained in, but cannot cope with terrain outside its test parameters.

An adaptive system does not take the place of training, but supplements it. Whereas training imprints specific environments, an adaptive system imprints terrain elements and the interactions among them, allowing the vehicle to build a map of local elements using proprioceptive sensors. Such sensors can measure velocity, wheel slippage, bumper hits, and acceleration. Data obtained by these sensors can be compared with observations from ranging sensors such as cameras and LADAR (laser detection and ranging), enabling the vehicle to adapt to any kind of terrain. In this way, it can sample its surroundings to build a map not only of clear space, but also of what kind of space it is and its composition.
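The self-supervision described above can be sketched in a few lines: appearance features observed by ranging sensors are later paired with the proprioceptive outcome (here, wheel slip) measured when the vehicle actually drives over the same patch, so the terrain labels itself. The feature choices, numbers, and nearest-neighbor predictor below are illustrative assumptions, not the actual JPL implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TraversabilityModel:
    """Nearest-neighbor regressor mapping appearance features to measured slip."""
    samples: list = field(default_factory=list)  # (feature_vector, slip) pairs

    def add_experience(self, features, measured_slip):
        # The proprioceptive measurement supplies the label --
        # no human annotation is required.
        self.samples.append((features, measured_slip))

    def predict_slip(self, features):
        # Predict slip for unseen terrain from the most similar past experience.
        def sq_dist(sample):
            return sum((a - b) ** 2 for a, b in zip(sample[0], features))
        return min(self.samples, key=sq_dist)[1]

model = TraversabilityModel()
# Hypothetical (height variance, roughness) features paired with slip
# measured while driving over each patch:
model.add_experience((0.01, 0.1), measured_slip=0.05)  # hard, flat ground
model.add_experience((0.02, 0.9), measured_slip=0.60)  # soft sand
print(model.predict_slip((0.02, 0.8)))  # resembles the sandy sample -> 0.6
```

In a fielded system the nearest-neighbor lookup would be replaced by a learned classifier or regressor over far richer geometric and appearance features, but the labeling loop is the same.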

By having a set of building blocks consisting of terrain features, a vehicle can adapt to terrain it has never seen before, and thus be robust to a changing environment. New observations can be added to its library, enabling it to infer terrain types it was not trained on. This would be especially useful in alien environments, where many of the physical features are known but some are not. For example, a seemingly flat, hard plain could actually be soft sand; the vehicle would sense the sand and avoid it automatically.
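The soft-sand example suggests a simple trigger for growing the terrain library: when the proprioceptive outcome contradicts the appearance-based prediction, the surprise itself marks a novel terrain type worth remembering. The threshold and dictionary layout below are illustrative assumptions, not details from the original work.

```python
# Assumed tolerance for prediction error; not from the original work.
SLIP_SURPRISE_THRESHOLD = 0.3

def update_library(library, features, predicted_slip, measured_slip):
    """Add a new terrain entry when the prediction was badly wrong."""
    if abs(measured_slip - predicted_slip) > SLIP_SURPRISE_THRESHOLD:
        library.append({"features": features, "slip": measured_slip})
        return True   # novel terrain type learned
    return False      # prediction was adequate; no update needed

library = [{"features": (0.01, 0.1), "slip": 0.05}]  # known hard, flat ground
# The plain looked flat and hard (predicted slip 0.05), but the wheels
# slipped heavily -- soft sand is added to the library:
learned = update_library(library, (0.01, 0.12),
                         predicted_slip=0.05, measured_slip=0.70)
```

After the update, subsequent predictions over similar-looking ground would draw on the sand entry, so the vehicle avoids repeating the surprise.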

This work was done by Max Bajracharya, Andrew B. Howard, and Larry H. Matthies of Caltech for NASA's Jet Propulsion Laboratory. NPO-46601