Today’s mobile computing platforms (phones and tablets) contain considerable computing power and are instrumented similarly to spacecraft. They have cameras, accelerometers, gyros, magnetometers, GPS, and multiple radios for communication. It was postulated that a modern commercial smartphone could make an excellent test platform for spacecraft flight software.

To demonstrate this concept, JPL flight software was loaded onto an iOS device, connected to the device's instrumentation, and shown to be working properly. The AutoNav flight software library was integrated with a Virtual Machine Language (VML 3) executive as the operational control code. The intention was to compile the software properly for the smartphone, and then to process an image from the camera through AutoNav's image-processing library in real time. This would demonstrate that the software was properly compiled for the target and connected to the camera, and that representative components of the flight software were operational.
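Conceptually, the integration resembles any iOS app that links C libraries: the AutoNav and VML code is compiled for the ARM target and called from Swift through a bridging header. The following is a minimal sketch of that startup path, assuming hypothetical entry points autonav_init() and vml_start(); the actual flight-software interfaces are not published.

    import UIKit

    // Minimal sketch of the assumed startup path. The AutoNav and VML C
    // libraries are presumed linked into the app and exposed to Swift via
    // a bridging header; autonav_init() and vml_start() are hypothetical
    // names, not the real flight-software API.
    @main
    class AppDelegate: UIResponder, UIApplicationDelegate {
        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions launchOptions:
                             [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            // Bring up the flight software before the UI appears.
            // autonav_init()      // hypothetical C entry point
            // vml_start("demo")   // hypothetical VML executive start
            return true
        }
    }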

AutoNav is a flight software package for autonomous optical navigation. It was used successfully by the Deep Space 1, Deep Impact, and Stardust missions. VML is a spacecraft control language authored and distributed by Blue Sun Enterprises. VML flight software has seen service on 14 flight missions.

As a result of this research effort, an Apple iPhone 6 Plus was loaded with the AutoNav/VML flight software suite. The flight software components have been wrapped in a conventional iOS application that can be started like any other smartphone app, by tapping on the screen. AutoNav and VML are then loaded and initialized, and a connection is made to the camera. Opnav-like images can then be captured with the tap of a button; each picture is passed through one of AutoNav's image-processing routines, and the center-finding results are displayed directly on the screen by way of an image overlay.
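One plausible shape for that capture-and-overlay step is sketched below in Swift. The grayscale conversion and the overlay use standard UIKit and Core Graphics calls; autonav_find_center() is a hypothetical stand-in for the AutoNav routine reached through the bridging header, and the 8-bit pixel-buffer format is an assumption.

    import UIKit

    // Sketch of the tap-to-capture flow: convert the camera frame to a
    // grayscale buffer, hand it to the (hypothetical) center-finding
    // routine, and draw the result as an overlay.
    func processCapturedImage(_ image: UIImage, overlayOn imageView: UIImageView) {
        guard let cg = image.cgImage else { return }
        let width = cg.width, height = cg.height

        // Render the frame into an 8-bit grayscale buffer, a common input
        // format for optical-navigation image processing (format assumed).
        var pixels = [UInt8](repeating: 0, count: width * height)
        pixels.withUnsafeMutableBytes { buf in
            if let ctx = CGContext(data: buf.baseAddress, width: width, height: height,
                                   bitsPerComponent: 8, bytesPerRow: width,
                                   space: CGColorSpaceCreateDeviceGray(),
                                   bitmapInfo: CGImageAlphaInfo.none.rawValue) {
                ctx.draw(cg, in: CGRect(x: 0, y: 0, width: width, height: height))
            }
        }

        // Run center-finding; autonav_find_center() is a hypothetical name
        // for AutoNav's image-processing entry point.
        var cx = Double(width) / 2, cy = Double(height) / 2
        // autonav_find_center(&pixels, Int32(width), Int32(height), &cx, &cy)

        // Display the result directly on screen as an image overlay.
        let marker = UIView(frame: CGRect(x: cx - 5, y: cy - 5, width: 10, height: 10))
        marker.layer.borderColor = UIColor.red.cgColor
        marker.layer.borderWidth = 2
        imageView.addSubview(marker)
    }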

Under the hood, both VML and AutoNav are compiled for the iPhone 6 Plus under iOS, and are operational flight components. The AutoNav image-processing algorithms operate on real camera images of representative targets. To fully demonstrate the other capabilities of the flight software (orbit determination, maneuver calculation, sequencing, etc.), a test arena must be configured to provide the camera with data that makes sense in a spaceflight navigation environment.

Potential next steps for this technology include connecting to more device instruments (accelerometers, gyros, etc.); pairing with real or simulated test environments; pairing with other commodity hardware such as an external camera or I/O connections; enabling external connections via WiFi or Bluetooth; remote commanding; and telemetry downlink.
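As an illustration of the first of these steps, the phone's accelerometers and gyros are reachable through Apple's Core Motion framework. The sketch below streams inertial samples at 100 Hz; ingestImuSample() is a hypothetical flight-software entry point, not part of AutoNav or VML as delivered.

    import CoreMotion

    // Sketch: feed the phone's IMU to the flight software via Core Motion.
    // CMMotionManager is standard iOS API; ingestImuSample() is assumed.
    let motion = CMMotionManager()   // must stay alive while updates run

    func startImuFeed() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.01   // 100 Hz
        motion.startDeviceMotionUpdates(to: .main) { sample, _ in
            guard let s = sample else { return }
            // Hand acceleration (g) and rotation rate (rad/s) to the
            // flight code through a hypothetical ingest function.
            // ingestImuSample(s.userAcceleration.x, s.userAcceleration.y,
            //                 s.userAcceleration.z, s.rotationRate.x,
            //                 s.rotationRate.y, s.rotationRate.z)
            _ = s
        }
    }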

This work was done by Andrew T. Vaughan of Caltech and Christopher A. Grasso of Blue Sun Enterprises for NASA's Jet Propulsion Laboratory. This software is available for license through the Jet Propulsion Laboratory, and you may request a license at https://download.jpl.nasa.gov/ops/request/request_introduction.cfm. NPO-50007



This article first appeared in the July 2016 issue of NASA Tech Briefs Magazine.
