NGDCS Linux Application for Imaging-Spectrometer Data Acquisition and Display

NASA’s Jet Propulsion Laboratory, Pasadena, California

A simple method of controlling the recording and display of imaging-spectrometer data in airborne flight was needed. Existing commercial packages were overly complicated and sometimes difficult to operate in a bouncing plane. The software also had to keep up with the imaging data rate while running on commodity hardware and a desktop operating system. Finally, the software needed to be as robust as possible: repeating a flight because of lost data is sometimes impossible, and always expensive.


Detection of Carried and Dropped Objects in Surveillance Video

This software analyzes a video input stream and automatically detects carried and dropped objects in near-real-time.

NASA’s Jet Propulsion Laboratory, Pasadena, California

DARPA’s Mind’s Eye Program aims to develop a smart camera surveillance system that can autonomously monitor a scene and report back human-readable text descriptions of activities that occur in the video. An important aspect is whether objects are brought into the scene, exchanged between persons, left behind, picked up, etc. While some objects can be detected with an object-specific recognizer, many others are not well suited to this type of approach. For example, a carried object may be too small relative to the resolution of the camera to be easily identifiable, or an unusual object, such as an improvised explosive device, may be too rare or unique in its appearance to have a dedicated recognizer. Hence, a generic object detection capability, which can locate objects without a specific model of what to look for, is used. This approach can detect objects even when they are partially occluded or overlapping with humans in the scene.
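To illustrate the idea of model-free detection, the sketch below flags pixels that differ from a background frame and groups them into candidate object regions. This is a minimal stand-in for generic object detection, not JPL’s actual algorithm; the function name, thresholds, and connected-component logic are assumptions made for the example.

```python
import numpy as np

def detect_changed_regions(frame, background, diff_thresh=25, min_pixels=20):
    """Generic change detection: flag pixels that differ from a background
    model, then group them into 4-connected regions and return bounding
    boxes (min_row, min_col, max_row, max_col). Thresholds are
    illustrative assumptions, not values from the brief."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > diff_thresh
    labels = np.zeros(mask.shape, dtype=int)   # visited markers
    boxes = []
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                           # already part of a region
        stack, region = [seed], []
        labels[seed] = 1
        while stack:                           # iterative flood fill
            r, c = stack.pop()
            region.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = 1
                    stack.append((nr, nc))
        if len(region) >= min_pixels:          # ignore tiny blobs (noise)
            rs, cs = zip(*region)
            boxes.append((min(rs), min(cs), max(rs), max(cs)))
    return boxes

# Synthetic example: one bright square appears against an empty background
bg = np.zeros((40, 40), dtype=np.uint8)
fr = bg.copy()
fr[10:20, 15:25] = 200
boxes = detect_changed_regions(fr, bg)
```

In a real pipeline the background model would be adaptive and the candidate regions would be tracked over time to decide whether an object was carried, dropped, or exchanged.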


Visualization of fMRI Network Data

NASA’s Jet Propulsion Laboratory, Pasadena, California

Functional connections within the brain can be revealed through functional magnetic resonance imaging (fMRI), which shows simultaneous activations of blood flow in the brain during response tests. However, fMRI specialists currently lack a tool for visualizing the complex data that comes from fMRI scans. They work with correlation matrices that tabulate which functional-region connections exist, but they have no corresponding visualization.


Viewpoints Software for Visualization of Multivariate Data

Ames Research Center, Moffett Field, California

Viewpoints software allows interactive visualization of multivariate data using a variety of standard techniques. The software is built exclusively from high-performance, cross-platform, open-source, standards-compliant languages, libraries, and components. The techniques included are:


Algorithm for Estimating PRC Wavefront Errors from Shack-Hartmann Camera Images

Phase retrieval is used for the calibration and fine alignment of an optical system.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Phase retrieval (PR) and the Shack-Hartmann sensor (SHS) are the two preferred methods of image-based wavefront sensing, widely used in optical testbeds, adaptive optics systems, and ground- and space-based telescopes. They recover the phase information of an optical system from defocused point-source images (PR) or from focused point-source or extended-scene images (SHS). For example, the Terrestrial Planet Finder Coronagraph’s (TPF-C’s) High-Contrast Imaging Testbed (HCIT) uses a PR camera (PRC) to estimate, and subsequently correct, the phase error at the exit pupil of the optical system. Several other testbeds at JPL were, and will be, equipped with both a PRC and a Shack-Hartmann camera (SHC).
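As background on how an SHC measurement becomes a wavefront estimate, the sketch below performs a textbook zonal least-squares reconstruction: each lenslet’s spot displacement gives a local wavefront slope, and the wavefront is recovered by solving a finite-difference system. This is a generic illustration, not the PRC/SHC algorithm of the brief; the function name and grid conventions are assumptions.

```python
import numpy as np

def reconstruct_wavefront(sx, sy, pitch=1.0):
    """Zonal least-squares wavefront reconstruction from Shack-Hartmann
    slopes. sx, sy are (n, n) arrays of local x/y wavefront slopes
    (spot displacement divided by lenslet focal length). Returns an
    (n, n) wavefront with the unobservable piston removed."""
    n = sx.shape[0]
    idx = lambda r, c: r * n + c          # flatten grid index
    rows, cols, vals, b, eq = [], [], [], [], 0
    for r in range(n):
        for c in range(n):
            if c + 1 < n:                 # forward difference in x
                rows += [eq, eq]; cols += [idx(r, c + 1), idx(r, c)]
                vals += [1.0, -1.0]
                b.append(0.5 * (sx[r, c] + sx[r, c + 1]) * pitch)
                eq += 1
            if r + 1 < n:                 # forward difference in y
                rows += [eq, eq]; cols += [idx(r + 1, c), idx(r, c)]
                vals += [1.0, -1.0]
                b.append(0.5 * (sy[r, c] + sy[r + 1, c]) * pitch)
                eq += 1
    A = np.zeros((eq, n * n))
    A[rows, cols] = vals                  # sparse-style fill of the system
    w, *_ = np.linalg.lstsq(A, np.array(b), rcond=None)
    w = w.reshape(n, n)
    return w - w.mean()                   # remove piston
```

For a purely tilted (planar) wavefront the slopes are constant, and the reconstruction recovers the tilt exactly up to piston.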


Head-Worn Display Concepts for Ground Operations for Commercial Aircraft

This display enables a higher level of safety during ground operations, including taxiway navigation and situational awareness.

Langley Research Center, Hampton, Virginia

The Integrated Intelligent Flight Deck (IIFD) project, part of NASA’s Aviation Safety Program (AvSP), comprises a multi-disciplinary research effort to develop flight deck technologies that mitigate operator-, automation-, and environment-induced hazards. Toward this objective, the IIFD project is developing crew/vehicle interface technologies that reduce the propensity for pilot error, minimize the risks associated with pilot error, and proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the Next Generation Air Transportation System (NextGen). Part of this research involves the use of synthetic and enhanced vision systems and advanced display media as enabling crew-vehicle interface technologies to meet these safety challenges.


Flight Imagery Recorder Locator (FIRLo) and High-Temperature Radome

This technology is applicable to the commercial airline industry for locating “black boxes.”

NASA’s Jet Propulsion Laboratory, Pasadena, California

The Low Density Supersonic Decelerator (LDSD) is a Mars entry, descent, and landing (EDL) technology development project that launched three test vehicles out of the Pacific Missile Range Facility in Kauai. Most mission science data could be recorded safely on land; however, high-speed, high-resolution imagery could not be telemetered due to bandwidth constraints. Therefore, all such information had to be recorded solely onboard the test vehicle, in a unit called the flight imagery recorder (FIR). A typical commercial airliner “black box” can record only on the order of gigabytes of data, whereas this work required on the order of terabytes, a few orders of magnitude more.


Imaging Space System Architectures Using a Granular Medium as a Primary Concentrator

Higher-resolution optics provide improved hyperspectral imaging for ocean and land monitoring, as well as exoplanet detection.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Typically, the cost of a space observatory is driven by the size and mass of the primary aperture. A monolithic aperture is generally much heavier and more complex to fabricate (hence, more costly) than an aperture of the same size composed of much smaller units. Formation-flying technology, as applied to swarm systems in space, is an emerging discipline.


Novel Hemispherical Dynamic Camera for EVAs

A novel optical design for imaging systems achieves an ultra-wide field of view (UW-FOV) of up to 208° using an integrated optical design (IOD) approach. Compared against a baseline fisheye lens, the UW-FOV optics reduce wasted pixels by 49%. The IOD approach results in a design with superior optical performance and minimal distortion.


Reducing Drift in Stereo Visual Odometry

The drift was reduced from an uncorrected 47 cm to just 7 cm.

Visual odometry (VO) refers to the estimation of vehicle motion using onboard cameras. A common mode of operation uses stereovision to triangulate a set of image features, track them over time, and infer vehicle motion by computing the apparent point-cloud motion with respect to the cameras. It has been observed that stereo VO is subject to drift over time.
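The motion-inference step described above can be sketched as a rigid-body fit between two triangulated point clouds, here via the standard Kabsch (SVD) solution. This is a minimal illustration of that one step under stated assumptions; it omits the outlier rejection and drift-reduction machinery that a real stereo VO pipeline, including the one in the brief, would add.

```python
import numpy as np

def estimate_motion(pts_prev, pts_curr):
    """Estimate the rigid transform (R, t) mapping triangulated 3-D feature
    points from the previous frame onto the current frame, using the
    Kabsch/Procrustes SVD solution. The camera's own motion is the inverse
    of this apparent point-cloud motion. pts_* have shape (N, 3)."""
    cp, cc = pts_prev.mean(axis=0), pts_curr.mean(axis=0)
    H = (pts_prev - cp).T @ (pts_curr - cc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ cp
    return R, t
```

Chaining these per-frame estimates is exactly what accumulates drift: small errors in each (R, t) compound over the trajectory, which motivates the correction described in the brief.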