Tech Briefs

Improvement in Recursive Hierarchical Segmentation of Data

[Figure: a segmentation of the Landsat ETM+ image displayed on the left is shown on the right; the new approach eliminates processing artifacts.]

A further modification has been made in the algorithm and implementing software reported in “Modified Recursive Hierarchical Segmentation of Data” (GSC-14681-1), NASA Tech Briefs, Vol. 30, No. 6 (June 2006), page 51. That software performs recursive hierarchical segmentation of data having spatial characteristics (e.g., spectral-image data). The output of a prior version of the software contained artifacts, including spurious segmentation-image regions bounded by processing-window edges. The modification for suppressing the artifacts, mentioned in the cited article, was the addition of a subroutine that analyzes data in the vicinities of seams to find pairs of regions that tend to lie adjacent to each other on opposite sides of the seams. Within each such pair, pixels in one region that are more similar to pixels in the other region are reassigned to the other region. The present modification provides a parameter, ranging from 0 to 1, for controlling the relative priority of merges between spatially adjacent and spatially non-adjacent regions. At 1, spatially-adjacent and spatially-non-adjacent region merges have equal priority. At 0, only spatially-adjacent region merges are allowed (no spectral clustering). Between 0 and 1, spatially-adjacent region merges have priority over spatially-non-adjacent ones.
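The effect of the 0-to-1 priority parameter can be sketched as follows. The data layout, the function name, and the exact weighting scheme (inflating the dissimilarity of non-adjacent merges by dividing by the parameter) are illustrative assumptions, not the brief's actual implementation:

```python
def best_merge(pairs, weight):
    """Pick the best region pair to merge, down-weighting spatially
    non-adjacent (spectral-clustering) merges.

    `pairs` is a list of (dissimilarity, adjacent_flag, pair_id)
    tuples; this structure is a sketch, not the brief's actual API.
    At weight 1.0 both merge types compete equally; at 0.0 only
    spatially adjacent merges are allowed.
    """
    best = None
    for dissim, adjacent, pair in pairs:
        if adjacent:
            effective = dissim
        elif weight > 0.0:
            # Dividing by the weight inflates non-adjacent dissimilarity,
            # so adjacent merges win priority when the weight is below 1.
            effective = dissim / weight
        else:
            continue  # weight 0: spectral clustering disabled entirely
        if best is None or effective < best[0]:
            best = (effective, pair)
    return None if best is None else best[1]
```

With a non-adjacent pair at dissimilarity 0.5 and an adjacent pair at 0.6, the non-adjacent merge wins at weight 1.0, but any weight below 0.6/0.5 ≈ 0.83 flips the choice to the adjacent pair.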

Posted in: Briefs, TSP, Software, Computer software and hardware


Using Heaps in Recursive Hierarchical Segmentation of Data

A modification for increasing speed has been made in the algorithm and implementing software reported in “Modified Recursive Hierarchical Segmentation of Data” (GSC-14681-1), NASA Tech Briefs, Vol. 30, No. 6 (June 2006), page 51. That software performs recursive hierarchical segmentation of data having spatial characteristics (e.g., spectral-image data). The segmentation process includes an iterative subprocess, in each iteration of which it is necessary to determine the best pair of regions to merge [merges being justified by one or more measures of similarity of pixels in the regions]. In the previously reported version of the algorithm and software, the choice of a best pair of regions to merge involved the use of a fully sorted list of regions. That version was computationally inefficient because a fully sorted list is not needed: all that is needed is the identity of the pair of regions characterized by the smallest measure of dissimilarity. The present modification replaces the fully sorted list with data heaps, which are computationally more efficient for performing the required comparisons among dissimilarity measures. The modification includes the incorporation of standard and modified functions for creating and updating data heaps.
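The core of the speedup can be sketched with Python's standard-library heap: building a heap is O(n) and each minimum extraction is O(log n), whereas maintaining a fully sorted list costs O(n log n) when only the minimum is ever consumed. The data layout below is an illustrative assumption, not the brief's actual interface:

```python
import heapq

def extract_best_pairs(dissim, k):
    """Return the k least-dissimilar region pairs using a heap.

    `dissim` maps (region_a, region_b) tuples to a dissimilarity
    measure; this layout is a sketch, not the brief's actual API.
    Only the minimum is needed at each merge step, so a full sort
    is wasted work.
    """
    heap = [(d, pair) for pair, d in dissim.items()]
    heapq.heapify(heap)  # O(n), versus O(n log n) for a full sort
    return [heapq.heappop(heap)[1] for _ in range(min(k, len(heap)))]
```

A real implementation would also push updated dissimilarities back onto the heap after each merge (the "standard and modified functions for creating and updating data heaps" the brief mentions), typically with lazy deletion of stale entries.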

Posted in: Briefs, TSP, Software, Computer software and hardware, Imaging and visualization, Data management


Tool for Statistical Analysis and Display of Landing Sites

MarsLS is a software tool for analyzing statistical dispersion of spacecraft-landing sites and displaying the results of its analyses. Originally intended for the Mars Exploration Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data (see figure) on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse — a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user’s convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
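The standard way to derive a dispersion ellipse from Monte Carlo landing points is to eigen-decompose the 2x2 sample covariance of the latitude/longitude pairs. The sketch below illustrates that computation; the function name, the n-sigma scaling, and the planar treatment of latitude/longitude are assumptions — MarsLS's actual interface and probability scaling are not documented here:

```python
import math

def landing_ellipse(lats, lons, scale=3.0):
    """Fit an n-sigma dispersion ellipse to Monte Carlo landing points.

    Returns (center, semimajor, semiminor, orientation_rad) from the
    closed-form eigen-decomposition of the 2x2 sample covariance.
    Treats lat/lon as planar coordinates -- an illustrative
    simplification, adequate only for small dispersions.
    """
    n = len(lats)
    mlat = sum(lats) / n
    mlon = sum(lons) / n
    # Sample covariance terms
    sxx = sum((x - mlat) ** 2 for x in lats) / (n - 1)
    syy = sum((y - mlon) ** 2 for y in lons) / (n - 1)
    sxy = sum((x - mlat) * (y - mlon) for x, y in zip(lats, lons)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]], closed form for 2x2
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # major-axis orientation
    return ((mlat, mlon), scale * math.sqrt(l1),
            scale * math.sqrt(max(l2, 0.0)), theta)
```

The `scale` factor sets the containment level; 3-sigma is a common choice for landing-ellipse depictions, though the exact probability it corresponds to depends on the assumed distribution.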

Posted in: Briefs, TSP, Software, Statistical analysis, Displays, Entry, descent, and landing, Spacecraft


Automated Assignment of Proposals to Reviewers

A computer program automates the process of selecting unbiased peer reviewers of research proposals submitted to NASA. Heretofore, such selection has been performed by manual searching of two large databases subject to a set of assignment rules. One database lists proposals and proposers; the other database lists potential reviewers. The manual search takes an average of several weeks per proposal. In contrast, the present software can perform the selection in seconds. The program begins by selecting one entry from each database, then applying the assignment rules to this pair of entries. If and only if all the assignment rules are satisfied, the chosen reviewer is assigned to the chosen proposal. The assignment rules enforced by the program are (1) a maximum allowable number of proposals assigned to a single reviewer; (2) a maximum allowable number of reviewers assigned to a single proposal; (3) if the proposing team includes a member affiliated with an industry, then the reviewer must not be affiliated with any industry; and (4) the reviewer must not be a member of the proposing team or affiliated with the same institution as that of a member of the proposing team.
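The four assignment rules reduce to a simple eligibility predicate applied to each reviewer/proposal pair. In this sketch the record fields and the numeric caps are illustrative assumptions — the brief names the rules but not the limit values or data schema:

```python
def eligible(reviewer, proposal, assignments,
             max_per_reviewer=8, max_per_proposal=3):
    """Apply the brief's four assignment rules to one reviewer/proposal
    pair. `assignments` is a list of (reviewer_name, proposal_id)
    pairs already made; field names and caps are assumptions.
    """
    # Rule 1: cap on proposals assigned to a single reviewer
    if sum(r == reviewer["name"] for r, _ in assignments) >= max_per_reviewer:
        return False
    # Rule 2: cap on reviewers assigned to a single proposal
    if sum(p == proposal["id"] for _, p in assignments) >= max_per_proposal:
        return False
    # Rule 3: industry member on the team => no industry-affiliated reviewer
    if proposal["has_industry_member"] and reviewer["industry"]:
        return False
    # Rule 4: reviewer must not be on the team or share an institution
    if reviewer["name"] in proposal["team"]:
        return False
    if reviewer["institution"] in proposal["institutions"]:
        return False
    return True
```

Iterating this predicate over the cross product of the two databases, and committing an assignment only when it returns true, reproduces the select-pair-then-test loop the brief describes.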

Posted in: Briefs, TSP, Software, Computer software and hardware, Data management


Array-Pattern-Match Compiler for Opportunistic Data Analysis

A computer program has been written to facilitate real-time sifting of scientific data as they are acquired to find data patterns deemed to warrant further analysis. The patterns in question are of a type denoted array patterns, which are specified by nested parenthetical expressions. [One example of an array pattern is ((>3) 0 (≠1)): this pattern matches a vector of at least three elements, the first of which exceeds 3, the second of which is 0, and the third of which does not equal 1.] This program accepts a high-level description of a static array pattern and compiles a highly optimized and compact program that determines whether any given data array matches that pattern. The compiler implemented by this program is independent of the target language, so that, as new languages come into use for writing code that processes scientific data, the compiler can easily be adapted to them. This program runs on a variety of different computing platforms. It must be run in conjunction with any one of a number of Lisp compilers that are available commercially or as shareware.
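The idea of compiling a pattern into a matcher can be sketched by turning each pattern element into a check closure ahead of time, so matching an array is just running the precompiled checks. The tuple-based pattern syntax here is an illustrative stand-in for the brief's Lisp-style nested parenthetical expressions:

```python
import operator

# Comparison operators understood by this sketch; the actual array-pattern
# language is richer than this.
OPS = {">": operator.gt, "<": operator.lt, "!=": operator.ne, "=": operator.eq}

def compile_pattern(pattern):
    """Compile an array pattern into a predicate over sequences.

    Each element is either a literal (matched by equality) or an
    (op, value) tuple. The brief's example ((>3) 0 (!=1)) becomes
    [(">", 3), 0, ("!=", 1)].
    """
    checks = []
    for elem in pattern:
        if isinstance(elem, tuple):
            op, val = elem
            # Bind the operator and value now, at "compile" time
            checks.append(lambda x, f=OPS[op], v=val: f(x, v))
        else:
            checks.append(lambda x, v=elem: x == v)

    def matcher(array):
        if len(array) < len(checks):
            return False  # pattern requires at least this many elements
        return all(chk(x) for chk, x in zip(checks, array))
    return matcher
```

Note the "at least" semantics: a longer array still matches as long as its leading elements satisfy the pattern, mirroring the brief's example.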

Posted in: Briefs, TSP, Software, Analysis methodologies, Data acquisition and handling


Pre-Processor for Compression of Multispectral Image Data

A computer program that preprocesses multispectral image data has been developed to provide the Mars Exploration Rover (MER) mission with a means of exploiting the additional correlation present in such data without appreciably increasing the complexity of compressing the data. When used in conjunction with ICER, a previously developed image-data-compression program, this program enables improved compression of multispectral images, compared to that achievable by use of ICER alone. As such, it is a straightforward means of achieving much of the gain possible from exploiting spectral correlation. This preprocessor software accommodates up to seven images that are different spectral bands of the same scene. The software performs an approximate discrete cosine transform (DCT) pixelwise across the spectral bands. The software is written for speed; in particular, the DCT operation performs only integer operations (producing integer output) and uses multiplications sparingly. Separate code is used for each possible number of spectral bands, including numbers for which fast DCT functions are not normally implemented. The DCT output is scaled so that, if the original images have a bit depth of at most 12, the transformed images are guaranteed to have a dynamic range appropriate for compression by the ICER software on the MER rovers. The resulting transformed bands are compressed individually by ICER. To reconstruct the images, the transformed images are first decompressed by use of the decompressor for ICER, then the resulting reconstructed images are passed to an inverse-DCT subprogram, which reconstructs the various spectral bands.
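The "approximate integer DCT" idea can be sketched with fixed-point arithmetic: the DCT-II basis is precomputed once as scaled integers, so the per-pixel transform across the spectral bands uses only integer multiplies, adds, and a shift. This is an illustration of the general technique, not the MER preprocessor's actual transform, whose details (and its multiplication-sparing structure) the brief does not give:

```python
import math

def int_dct_matrix(n, shift=8):
    """Orthonormal DCT-II basis rounded to fixed-point integers.

    Precomputed once per band count n, so the runtime transform is
    pure integer arithmetic. An illustrative sketch only.
    """
    scale = 1 << shift
    mat = []
    for k in range(n):
        a = math.sqrt((1 if k == 0 else 2) / n)
        mat.append([round(scale * a *
                          math.cos(math.pi * (2 * i + 1) * k / (2 * n)))
                    for i in range(n)])
    return mat

def transform_pixel(bands, mat, shift=8):
    """Apply the integer DCT across one pixel's spectral bands.

    Integer multiply-accumulate followed by rounding (add half) and
    a right shift back to the working bit depth.
    """
    half = 1 << (shift - 1)
    return [(sum(c * b for c, b in zip(row, bands)) + half) >> shift
            for row in mat]
```

A pixel whose value is identical in every band transforms to a single DC coefficient with all other coefficients zero, which is exactly the decorrelation that lets ICER compress the transformed bands more effectively.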

Posted in: Briefs, TSP, Software, Computer software and hardware, Imaging and visualization, Data management


Compressing Image Data While Limiting the Effects of Data Losses

ICER is computer software that can perform both lossless and lossy compression and decompression of gray-scale image data using discrete wavelet transforms. Designed for primary use in transmitting scientific image data from distant spacecraft to Earth, ICER incorporates an error-containment scheme that limits the adverse effects of loss of data and is well suited to the data packets transmitted by deep-space probes. The error-containment scheme includes utilization of the algorithm described in “Partitioning a Gridded Rectangle Into Smaller Rectangles” (NPO-30479), NASA Tech Briefs, Vol. 28, No. 7 (July 2004), page 56. ICER has performed well in onboard compression of thousands of images transmitted from the Mars Exploration Rovers.
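The principle behind error containment is that segments compressed independently share no coder state, so a lost or corrupted packet damages only its own segment. The toy sketch below uses `zlib` on row groups purely to illustrate that principle; ICER's actual scheme partitions a gridded rectangle into roughly equal smaller rectangles and uses wavelet-based coding, not `zlib`:

```python
import zlib

def compress_segments(rows, n_segments):
    """Compress image rows as independent segments.

    Illustrative only: stands in for ICER's partitioning of the image
    into independently coded rectangles. Each segment's bitstream is
    self-contained.
    """
    size = max(1, len(rows) // n_segments)
    segments = [rows[i:i + size] for i in range(0, len(rows), size)]
    return [zlib.compress(b"".join(seg)) for seg in segments]

def decompress_segments(packets):
    """Recover whatever segments survived; failures stay contained."""
    out = []
    for pkt in packets:
        try:
            out.append(zlib.decompress(pkt))
        except zlib.error:
            out.append(None)  # only this one segment is lost
    return out
```

With a single monolithic compressed stream, the same corruption would typically render everything after the damaged point unrecoverable; containment trades a little compression efficiency for that robustness.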

Posted in: Briefs, TSP, Software, Imaging and visualization, Data management
