A spacecraft guidance, navigation, and control (GN&C) system is needed to enable a spacecraft to descend to a surface, take a sample using a touch-and-go (TAG) approach, and then safely ascend. At the time of this reporting, a flyable GN&C system that can accomplish these goals is beyond the state of the art. This article describes AutoGNC, a GN&C system capable of addressing these goals that has recently been developed and demonstrated to a maturity beyond Technology Readiness Level 5 (TRL-5). The AutoGNC solution matures and integrates two previously existing JPL capabilities, AutoNAV and G-REX, into a single unified GN&C system. AutoNAV is JPL’s current flight navigation system; it is fairly mature for flybys of and rendezvous with small bodies, but lacks capability for close surface-proximity operations, sampling, and contact. G-REX is a suite of low-TRL algorithms and capabilities that enables spacecraft operations in close surface proximity and during sampling/contact. Integrating the matured AutoNAV and G-REX components into AutoGNC provides a single, unified GN&C capability that addresses the autonomy, close-proximity, and sampling/contact aspects of small-body sample-return missions.

AutoGNC is an integrated capability comprising elements that were developed separately. The main algorithms and component capabilities that have been matured and integrated are autonomy for near-surface operations, terrain-relative navigation (TRN), real-time image-based feedback guidance and control, and six degrees of freedom (6DOF) control of the TAG sampling event.

Autonomy is provided by an AutoGNC Executive written in Virtual Machine Language (VML) that incorporates high-level control, data management, and fault protection. During descent to the surface, the AutoGNC system uses camera images to determine its position and velocity relative to the terrain. This TRN capability leverages native capabilities of the original AutoNAV system, but required advancements to integrate the previously separate capabilities for shape modeling, state estimation, image rendering, building a database of onboard maps, and performing real-time landmark recognition against those stored maps.
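The landmark-recognition step can be illustrated with a highly simplified sketch (not the actual AutoNAV/AutoGNC algorithm): if the onboard map gives the world-frame position of each recognized landmark, and image processing combined with the attitude estimate yields the spacecraft-to-landmark offset in the same frame, then each landmark provides an independent position fix, and the least-squares estimate is simply their average. The function name and three-landmark scenario below are illustrative assumptions.

```python
def estimate_position(map_landmarks, observed_offsets):
    """Least-squares spacecraft position from landmark sightings (sketch).

    map_landmarks:    world-frame (x, y, z) of each recognized landmark,
                      looked up in the onboard map database.
    observed_offsets: spacecraft-to-landmark vectors in the same frame,
                      derived from camera images and the attitude estimate.

    Each landmark implies position = landmark - offset; averaging the
    implied positions gives the least-squares solution.
    """
    n = len(map_landmarks)
    return tuple(
        sum(m[k] - o[k] for m, o in zip(map_landmarks, observed_offsets)) / n
        for k in range(3)
    )

# Three landmarks seen from a spacecraft actually located at (1.0, 2.0, 3.0):
landmarks = [(10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 10.0)]
offsets = [(9.0, -2.0, -3.0), (-1.0, 8.0, -3.0), (-1.0, -2.0, 7.0)]
print(estimate_position(landmarks, offsets))  # → (1.0, 2.0, 3.0)
```

In practice the real system must also reject landmark misidentifications and fold these fixes into a filter that estimates velocity as well as position; the sketch shows only the geometric core of a single position fix.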

Using images to guide the spacecraft requires image-based feedback control. In AutoGNC, navigation estimates are fed into an onboard guidance and control system that keeps the spacecraft on a desired path as it descends toward its targeted landing or sampling site. Once near the site, AutoGNC achieves a prescribed guidance condition for TAG sampling (position/orientation and velocity) and a prescribed force profile on the sampling end-effector. A dedicated 6DOF TAG controller then executes the ascent burn while recovering from sampling disturbances and induced attitude rates. The controller also minimizes structural interactions with the flexible solar panels and prevents any part of the spacecraft other than the intended end-effector from contacting the ground.
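As a rough illustration of feedback guidance along a desired path (a minimal sketch under assumed dynamics, not AutoGNC's actual control law), a proportional-derivative loop can command accelerations that drive the navigated state toward a reference descent trajectory. The gains, time step, and descent scenario below are illustrative assumptions.

```python
def guidance_accel(pos, vel, ref_pos, ref_vel, kp=0.04, kd=0.4):
    """PD feedback: commanded acceleration driving the estimated state
    (pos, vel) toward the reference state (ref_pos, ref_vel)."""
    return tuple(kp * (rp - p) + kd * (rv - v)
                 for p, v, rp, rv in zip(pos, vel, ref_pos, ref_vel))

def step(pos, vel, accel, dt):
    """Simple Euler propagation of the translational state."""
    vel = tuple(v + a * dt for v, a in zip(vel, accel))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# Track a reference that descends at 0.5 m/s from 100 m (altitude only).
pos, vel = (100.0,), (0.0,)
dt, t = 0.1, 0.0
for _ in range(600):                       # 60 s of simulated descent
    ref_pos, ref_vel = (100.0 - 0.5 * t,), (-0.5,)
    a = guidance_accel(pos, vel, ref_pos, ref_vel)
    pos, vel = step(pos, vel, a, dt)
    t += dt
# After 60 s the reference is near 70 m; the closed loop tracks it closely.
```

The chosen gains make the tracking-error dynamics critically damped, so the spacecraft converges onto the reference path without overshoot; the real 6DOF TAG control must additionally handle attitude, flexible-panel interactions, and contact constraints.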

This work was done by John M. Carson, Nickolaos Mastrodemos, David M. Myers, Behcet Acikmese, James C. Blackmore, Dhemetrio Boussalis, Joseph E. Riedel, Simon Nolet, Johnny T. Chang, Milan Mandic, Laureano (Al) Cangahuala, Stephen B. Broschart, David S. Bayard, Andrew T. Vaughan, Tseng-Chan M. Wang, and Robert A. Werner of Caltech; Christopher A. Grasso of Blue Sun Enterprises; and Robert W. Gaskell of the Planetary Science Institute for NASA’s Jet Propulsion Laboratory. For more information, download the Technical Support Package (free white paper) at www.techbriefs.com/tsp under the Information Sciences category.

The software used in this innovation is available for commercial licensing. Please contact Daniel Broderick of the California Institute of Technology. Refer to NPO-47250.