# ICER-3D Hyperspectral Image Compression Software

Software has been developed to implement the ICER-3D algorithm. ICER-3D effects progressive, three-dimensional (3D), wavelet-based compression of hyperspectral images. If a compressed data stream is truncated, the progressive nature of the algorithm enables reconstruction of the hyperspectral data at a fidelity commensurate with the given data volume.
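Why truncation degrades gracefully can be illustrated with a toy bit-plane coder. ICER-3D's actual entropy coding is far more sophisticated; this sketch (all names illustrative) simply emits coefficient bits most-significant plane first, so every truncation point corresponds to a coarser uniform quantization of all coefficients:

```python
def encode_bitplanes(coeffs, nbits=8):
    """Emit one bit per coefficient per plane, MSB plane first."""
    stream = []
    for plane in range(nbits - 1, -1, -1):
        for c in coeffs:
            stream.append((c >> plane) & 1)
    return stream

def decode_bitplanes(stream, ncoeffs, nbits=8):
    """Rebuild coefficients from however much of the stream survives."""
    coeffs = [0] * ncoeffs
    k = 0
    for plane in range(nbits - 1, -1, -1):
        for i in range(ncoeffs):
            if k >= len(stream):
                return coeffs  # truncated: lower planes remain zero
            coeffs[i] |= stream[k] << plane
            k += 1
    return coeffs
```

Decoding the full stream is exact; decoding only the first *p* planes bounds every coefficient's error below 2^(nbits − p).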

# Context Modeler for Wavelet Compression of Hyperspectral Images

A context-modeling subalgorithm has been developed as part of an algorithm that effects three-dimensional (3D) wavelet-based compression of hyperspectral image data. The context-modeling subalgorithm, hereafter denoted the context modeler, provides estimates of probability distributions of wavelet-transformed data being encoded. These estimates are utilized by an entropy coding subalgorithm that is another major component of the compression algorithm. The estimates make it possible to compress the image data more effectively than would otherwise be possible.
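The general shape of such a context modeler can be sketched as follows. This is a minimal illustration, not ICER-3D's design: the contexts ICER-3D forms from neighboring wavelet coefficients across space and spectrum are not reproduced here, and the class name and smoothing choice are assumptions. Each context keeps running symbol counts and reports a smoothed probability estimate that an entropy coder would use to assign short codes to likely symbols:

```python
from collections import defaultdict

class ContextModel:
    """Adaptive per-context probability estimator (illustrative sketch)."""

    def __init__(self):
        # counts[context] = [zeros_seen, ones_seen]; start at 1/1
        # (Laplace smoothing) so no probability is ever exactly 0 or 1.
        self.counts = defaultdict(lambda: [1, 1])

    def prob_one(self, context):
        """Estimated probability that the next bit in this context is 1."""
        zeros, ones = self.counts[context]
        return ones / (zeros + ones)

    def update(self, context, bit):
        """Fold an observed bit into the running counts."""
        self.counts[context][bit] += 1
```

An entropy coder driven by these estimates spends roughly −log2(p) bits per symbol, which is how accurate context modeling translates into better compression.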

# Methodology and Software for Designing Data-Processing ICs

### A main goal is to reduce labor and errors in the design process.

A methodology and software to implement the methodology are under development in an effort to automate at least part of the process of designing integrated-circuit (IC) chips that perform complex data-processing functions. An important element of the methodology is reuse of prior designs, with modifications as required to optimize for a specific application. This minimizes a labor-intensive, error-prone part of the design process. The prior designs include what are known in the art as intellectual-property (IP) cores — that is, designs of functional blocks [e.g., random-access memories (RAMs), communications circuits, processors] that are incorporated into larger designs. Circuits may be optimized with respect to design goals, such as reducing chip size, reducing power consumption, and/or increasing radiation hardness.

# Efficient Algorithmic Interleaver for Turbo Decoder

### Permutations are computed when needed, rather than stored in lookup tables.

An efficient bit-interleaving algorithm for a turbo decoder differs from prior such algorithms in that it does not require memory to store permutation mappings and can work with constituent decoders that produce multiple bit reliabilities per decoding stage. The algorithm can be implemented in hardware: The original version of the algorithm applies to a serially concatenated pulse position modulation (SCPPM) decoder that has been implemented in a field-programmable gate array (FPGA). The specific decoder can perform within 1 dB of the Shannon capacity on a Poisson channel and is suitable for use in optical data communications at megabit-per-second speeds. A bit interleaver is an essential component of any turbo-like decoder, and the bit interleaver embodied in the present algorithm is essential for obtaining the capacity-approaching performance of the specific SCPPM decoder and the associated SCPPM scheme. The algorithm can also be adapted to turbo decoders for modulation/coding schemes other than SCPPM.
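The core idea — computing each permuted index on demand instead of reading it from a stored table — can be sketched with a different but well-known algorithmic interleaver. The SCPPM decoder's actual permutation is not reproduced here; as an illustration we use a quadratic permutation polynomial (QPP) interleaver of the kind used in LTE turbo codes, π(i) = (f1·i + f2·i²) mod N, with parameters (N = 40, f1 = 3, f2 = 10) known to yield a valid permutation:

```python
def qpp_index(i, n, f1, f2):
    """Compute the interleaved position of bit i on the fly --
    no lookup table is ever stored."""
    return (f1 * i + f2 * i * i) % n

def interleave(bits, f1, f2):
    n = len(bits)
    out = [0] * n
    for i, b in enumerate(bits):
        out[qpp_index(i, n, f1, f2)] = b
    return out

def deinterleave(bits, f1, f2):
    n = len(bits)
    return [bits[qpp_index(i, n, f1, f2)] for i in range(n)]
```

In hardware, the index computation reduces to a few adders updated recursively per clock cycle, which is what makes the table-free approach attractive for an FPGA implementation.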

# Generalized Approach to Prognosis for an Engineering System

### Software combines signal forecasting and prognostic reasoning methods to predict system failures.

This new generalized approach to prognostics can provide an automated early failure prediction of an engineering system or its components, often in time to prevent occurrence of hard failures. This approach has been demonstrated in a proof-of-concept software prototype, shown to accurately predict anomalies in the Mars Exploration Rover (MER) power systems using archived and model data. The approach differs from other attempted prognostic solutions in that it can interpret any sensed system trend, and not just specific failure modes with previously developed physics-of-failure models. The software employs an iterative reasoning process that implements (1) methods of forecasting signals represented by streams of sensor, telemetric, and other monitoring data and (2) new artificial intelligence methods for performing prognostic reasoning. This approach affords several new capabilities.
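One forecasting ingredient of such a system can be sketched simply: fit a trend to recent sensor samples and estimate when the trend will cross a failure threshold. The actual software combines multiple forecasting and AI reasoning methods; the function below is an illustrative assumption, not the prototype's algorithm:

```python
def predict_threshold_crossing(samples, dt, threshold):
    """Least-squares linear fit to uniformly sampled data; returns the
    time remaining until the fitted trend reaches threshold, or None
    if the trend is not heading toward it."""
    n = len(samples)
    ts = [i * dt for i in range(n)]
    mean_t = sum(ts) / n
    mean_y = sum(samples) / n
    sxx = sum((t - mean_t) ** 2 for t in ts)
    sxy = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, samples))
    slope = sxy / sxx
    if slope <= 0:
        return None  # signal is flat or improving; no predicted failure
    intercept = mean_y - slope * mean_t
    t_cross = (threshold - intercept) / slope
    remaining = t_cross - ts[-1]
    return remaining if remaining > 0 else 0.0
```

A prognostic reasoner would combine many such per-signal forecasts, with uncertainty estimates, before declaring an impending failure.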

# Stream Flow Prediction by Remote Sensing and Genetic Programming

### A genetic programming model assimilates SAR images and geoenvironmental parameters to assess soil moisture at the watershed scale.

A genetic programming (GP)-based, nonlinear modeling structure relates soil moisture to synthetic-aperture-radar (SAR) images to produce representative soil-moisture estimates at the watershed scale. Surface soil-moisture measurements are difficult to obtain over a large area because of the variety of soil permeability values and soil textures. Point measurements can be used over a small-scale area, but it is impossible to acquire such information effectively in large-scale watersheds. This model exhibits the capacity to assimilate SAR images and relevant geoenvironmental parameters to estimate soil moisture.

# Low-Complexity Lossless and Near-Lossless Data Compression Technique for Multispectral Imagery

### The technique allows substantially smaller compressed file sizes when a small amount of distortion can be tolerated.

This work extends the lossless data compression technique described in “Fast Lossless Compression of Multispectral-Image Data” (NPO-42517), NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26. The original technique was extended to include a near-lossless compression option, allowing substantially smaller compressed file sizes when a small amount of distortion can be tolerated. Near-lossless compression is obtained by including a quantization step prior to encoding of prediction residuals.
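The quantization step can be sketched as follows. This is a minimal illustration of near-lossless predictive coding in general, assuming a trivial previous-sample predictor rather than the technique's actual predictor; names are illustrative. Quantizing each residual with step 2δ + 1, and having the encoder predict from its own reconstructed values (closed-loop prediction), bounds every reconstructed sample's error by δ, with δ = 0 reducing to lossless operation:

```python
def quantize_residual(residual, delta):
    """Uniform quantizer with step 2*delta + 1; guarantees that
    dequantization recovers the residual to within +/- delta."""
    step = 2 * delta + 1
    if residual >= 0:
        return (residual + delta) // step
    return -((-residual + delta) // step)

def dequantize_residual(q, delta):
    return q * (2 * delta + 1)

def compress_line(samples, delta):
    """Previous-sample predictor; returns quantized residuals.
    Predicting from reconstructed values keeps errors from accumulating."""
    out, prev = [], 0
    for s in samples:
        q = quantize_residual(s - prev, delta)
        out.append(q)
        prev += dequantize_residual(q, delta)
    return out

def decompress_line(qs, delta):
    samples, prev = [], 0
    for q in qs:
        prev += dequantize_residual(q, delta)
        samples.append(prev)
    return samples
```

The quantized residuals cluster more tightly around zero than the raw residuals, which is what lets the downstream entropy coder produce smaller files.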

# Algorithm for Computing Particle/Surface Interactions

An algorithm has been devised for predicting the behaviors of sparsely spatially distributed particles impinging on a solid surface in a rarefied atmosphere. Under the stated conditions, prior particle-transport models in which (1) dense distributions of particles are treated as continuum fluids; or (2) sparse distributions of particles are considered to be suspended in and to diffuse through fluid streams are not valid.

# Safety and Quality Training Simulator

A portable system of electromechanical and electronic hardware and documentation has been developed as an automated means of instructing technicians in matters of safety and quality. The system enables elimination of most of the administrative tasks associated with traditional training. Customized, performance-based, hands-on training with integral testing is substituted for the traditional instructional approach of passive attendance in class followed by written examination.

# Rover Slip Validation and Prediction Algorithm

A physics-based simulation has been developed for the Mars Exploration Rover (MER) mission that applies slope-induced wheel slippage to the rover location estimator. Using the digital elevation map from stereo images, the computational method resolves the quasi-dynamic equations of motion, incorporating the actual wheel-terrain speed, to estimate the gross velocity of the vehicle.