The Autonomous Exploration for Gathering Increased Science System (AEGIS) provides automated targeting for remote-sensing instruments on the Mars Exploration Rover (MER) mission, which at the time of this reporting has two rovers exploring the surface of Mars (see figure). Currently, targets for rover remote-sensing instruments must be selected manually by the operations team, based on imagery that has already been downlinked to the ground. AEGIS enables the rover flight software to analyze imagery onboard and autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. In particular, this technology will be used to automatically acquire sub-framed, high-resolution, targeted images with the MER panoramic cameras.

Figure: The new MER capability for automated targeting.
This software provides the following capabilities (illustrated by the sketch after this list):

  • Automatic detection of terrain features in rover camera images,
  • Feature extraction for detected terrain targets,
  • Prioritization of terrain targets based on a scientist-specified target feature set, and
  • Automated re-targeting of rover remote-sensing instruments at the highest priority target.
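
The capabilities above form a detect/extract/prioritize/retarget pipeline. The Python sketch below illustrates that flow on a synthetic image; the function names, the simple intensity-threshold detector, the feature set (area, darkness, elongation), and the scientist weights are illustrative assumptions and do not represent the actual MER flight software.

    """Illustrative sketch of an AEGIS-style detect/extract/prioritize/retarget
    loop.  The detector, features, and weights are assumptions for illustration,
    not the MER flight implementation."""
    import numpy as np
    from scipy import ndimage


    def detect_targets(image, threshold=0.5):
        """Find dark, rock-like regions with simple intensity thresholding."""
        mask = image < threshold                    # assume rocks are darker than soil
        labels, count = ndimage.label(mask)         # connected-component labeling
        return [np.argwhere(labels == k) for k in range(1, count + 1)]


    def extract_features(region, image):
        """Compute a small feature vector (size, albedo proxy, shape) for one region."""
        values = image[region[:, 0], region[:, 1]]
        cov = np.cov(region.T) if len(region) > 1 else np.eye(2)
        eigvals = np.sort(np.linalg.eigvalsh(cov))
        return {
            "area": float(len(region)),                             # apparent size (pixels)
            "darkness": float(1.0 - values.mean()),                 # albedo proxy
            "elongation": float(eigvals[1] / (eigvals[0] + 1e-9)),  # shape cue
        }


    def prioritize(regions, image, weights):
        """Rank regions by a scientist-specified weighted feature score."""
        scored = []
        for region in regions:
            feats = extract_features(region, image)
            score = sum(weights.get(name, 0.0) * value for name, value in feats.items())
            scored.append((score, region))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return scored


    def retarget(image, weights):
        """Return the image centroid (row, col) of the highest-priority target."""
        regions = detect_targets(image)
        if not regions:
            return None
        _, best = prioritize(regions, image, weights)[0]
        return tuple(best.mean(axis=0))             # pointing goal for the instrument


    if __name__ == "__main__":
        # Synthetic camera frame: bright soil with two dark "rocks".
        frame = np.full((64, 64), 0.8)
        frame[10:14, 10:14] = 0.2                   # small rock
        frame[30:40, 35:50] = 0.3                   # large rock
        weights = {"area": 0.01, "darkness": 1.0, "elongation": 0.1}
        print("Highest-priority target centroid:", retarget(frame, weights))

In this sketch, scientist preferences enter only as per-feature weights, so the same detection and feature-extraction code can be re-prioritized for different science goals without changing the onboard software.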

This work was done by Benjamin J. Bornstein, Rebecca Castano, Tara A. Estlin, Daniel M. Gaines, Robert C. Anderson, David R. Thompson, Charles K. De Granville, Steve A. Chien, Benyang Tang, Michael C. Burl, and Michele A. Judd of Caltech for NASA’s Jet Propulsion Laboratory. For more information, contact NASA’s Jet Propulsion Laboratory.

This software is available for commercial licensing. Please contact Daniel Broderick of the California Institute of Technology. Refer to NPO-46876.