Special Coverage

Automated Activation and Deactivation of a System Under Test

The Multi-Purpose Logistics Module (MPLM) Automated Activation/Deactivation application was created with a threefold purpose: to reduce the possibility of human error in issuing commands to, or interpreting telemetry from, the MPLM power, computer, and environmental control systems; to reduce the test time required for the repetitive activation/deactivation processes; and to reduce the number of on-console personnel required for activation/deactivation.
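
The commanding pattern such an application automates can be illustrated with a minimal sketch. The step names, telemetry fields, and limits below are hypothetical placeholders, not the actual MPLM command set; the sketch only shows how a scripted sequence can replace manual command issuance and manual telemetry interpretation.

    import time

    # Hypothetical activation sequence: each step issues a command and states
    # the telemetry condition that must be verified before proceeding.
    ACTIVATION_STEPS = [
        {"command": "POWER_BUS_ON",  "telemetry": "bus_voltage",  "expect": lambda v: v > 110.0},
        {"command": "COMPUTER_BOOT", "telemetry": "cpu_state",    "expect": lambda v: v == "RUN"},
        {"command": "ECS_FAN_START", "telemetry": "airflow_rate", "expect": lambda v: v > 0.5},
    ]

    def activate(send_command, read_telemetry, timeout_s=60.0, poll_s=1.0):
        """Run the scripted sequence, verifying telemetry after each command.
        send_command and read_telemetry are supplied by the test environment."""
        for step in ACTIVATION_STEPS:
            send_command(step["command"])
            deadline = time.monotonic() + timeout_s
            while True:
                value = read_telemetry(step["telemetry"])
                if step["expect"](value):
                    break  # condition verified; continue with the next step
                if time.monotonic() > deadline:
                    raise RuntimeError(f"{step['command']}: telemetry check failed")
                time.sleep(poll_s)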

Posted in: Briefs

HFGMC Enhancement of MAC/GMC

Additional information about a mathematical model denoted the high-fidelity generalized method of cells (HFGMC) and implementation of the HFGMC within version 4.0 of the MAC/GMC software has become available. MAC/GMC (Micromechanics Analysis Code With Generalized Method of Cells) was a topic of several prior NASA Tech Briefs articles, version 4.0 having been described in “Comprehensive Micromechanics-Analysis Code — Version 4.0” (LEW-17495-1), NASA Tech Briefs, Vol. 29, No. 9 (September 2005), page 54. MAC/GMC predicts elastic and inelastic thermomechanical responses of composite materials. MAC/GMC utilizes the generalized method of cells (GMC) — a model of micromechanics that predicts macroscopic responses of a composite material as functions of the properties, sizes, shapes, and responses of its constituents (e.g., matrix and fibers). The accuracy of the GMC is limited by neglect of coupling between normal and shear stresses. The HFGMC was developed by combining elements of the GMC and a related model, denoted the higher-order theory for functionally graded materials (HOTFGM), that can account for this coupling. Hence, the HFGMC enables simulation of stress and strain with greater accuracy. Some alterations of the MAC/GMC data structure were necessitated by the greater computational complexity of the HFGMC.
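
As a much simpler illustration of the micromechanics idea (predicting a macroscopic property from constituent properties and volume fractions), the sketch below computes the classical Voigt and Reuss estimates of the effective modulus of a two-phase composite. These estimates are far cruder than GMC or HFGMC, which resolve subcell geometry and, in HFGMC's case, the normal/shear coupling discussed above; the sketch and its example numbers are only meant to make the homogenization concept concrete.

    def voigt_reuss_bounds(e_fiber, e_matrix, v_fiber):
        """Classical upper (Voigt, isostrain) and lower (Reuss, isostress)
        estimates of the effective Young's modulus of a two-phase composite,
        given constituent moduli and the fiber volume fraction."""
        v_matrix = 1.0 - v_fiber
        e_voigt = v_fiber * e_fiber + v_matrix * e_matrix          # rule of mixtures
        e_reuss = 1.0 / (v_fiber / e_fiber + v_matrix / e_matrix)  # inverse rule of mixtures
        return e_voigt, e_reuss

    # Example: stiff fibers (~400 GPa) in a metallic matrix (~110 GPa), 35% fiber.
    print(voigt_reuss_bounds(400.0, 110.0, 0.35))  # about (211.5, 147.4) GPa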

Posted in: Briefs, TSP

Architecture for Control of the K9 Rover

Software featuring a multilevel architecture is used to control the hardware on the K9 Rover, which is a mobile robot used in research on robots for scientific exploration and autonomous operation in general. The software consists of five types of modules:

Device Drivers — These modules, at the lowest level of the architecture, directly control motors, cameras, data buses, and other hardware devices.

Resource Managers — Each of these modules controls several device drivers. Resource managers can be commanded by either a remote operator or the pilot or conditional-executive modules described below.

Behaviors and Data Processors — These modules perform computations for such functions as planning paths, avoiding obstacles, visual tracking, and stereoscopy. These modules can be commanded only by the pilot.

Pilot — The pilot receives a possibly complex command from the remote operator or the conditional executive, then decomposes the command into (1) more-specific commands to the resource managers and (2) requests for information from the behaviors and data processors.

Conditional Executive — This highest-level module interprets a command plan sent by the remote operator, determines whether resources required for execution of the plan are available, monitors execution, and, if necessary, selects an alternate branch of the plan.
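
A minimal sketch of how these layers might be expressed in code follows. The class and method names are illustrative assumptions, not the actual K9 flight-software interfaces; the sketch only shows the direction of command flow described above (executive to pilot, pilot to resource managers and behaviors, resource managers to device drivers).

    class DeviceDriver:
        """Lowest level: talks directly to one device (motor, camera, data bus)."""
        def command(self, request):
            ...

    class ResourceManager:
        """Controls several device drivers; commandable by the remote operator,
        the pilot, or the conditional executive."""
        def __init__(self, drivers):
            self.drivers = drivers
        def execute(self, request):
            for driver in self.drivers:
                driver.command(request)

    class Behavior:
        """Computation such as path planning, obstacle avoidance, visual
        tracking, or stereo processing; commanded only by the pilot."""
        def compute(self, query):
            ...

    class Pilot:
        """Decomposes a possibly complex command into resource-manager commands
        and information requests to behaviors/data processors."""
        def __init__(self, managers, behaviors):
            self.managers, self.behaviors = managers, behaviors
        def handle(self, command):
            info = {name: b.compute(command) for name, b in self.behaviors.items()}
            for manager in self.managers:
                manager.execute({"command": command, "context": info})

    class ConditionalExecutive:
        """Highest level: interprets a command plan, checks resource
        availability, monitors execution, and falls back to an alternate
        branch of the plan if needed."""
        def __init__(self, pilot, available_resources):
            self.pilot = pilot
            self.available = set(available_resources)
        def run(self, plan):
            for required, steps in plan:          # each branch: (resources, steps)
                if set(required) <= self.available:
                    for step in steps:
                        self.pilot.handle(step)
                    return True
            return False                          # no executable branch found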

Posted in: Briefs, TSP

Satellite Image Mosaic Engine

A computer program automatically builds large, full-resolution mosaics of multispectral images of Earth landmasses from images acquired by Landsat 7, complete with matching of colors and blending between adjacent scenes. While the code has been used extensively for Landsat, it could also be used for other data sources. A single mosaic of as many as 8,000 scenes, the largest set produced in this work and represented by more than 5 terabytes of data, demonstrated the code's ability to provide global coverage. The program first statistically analyzes input images to determine areas of coverage and data-value distributions. It then transforms the input images from their original Universal Transverse Mercator coordinates to other geographical coordinates, with scaling. It applies a first-order polynomial brightness correction to each band in each scene, and it uses a data-mask image to select data and blend input scenes. Under user control, the program can operate on small parts of the output image space, with checkpoint and restart capabilities. The program runs on SGI IRIX computers. It is capable of parallel processing using shared-memory code, large memories, and tens of central processing units, and it can retrieve input data and store output data at locations remote from the processors on which it is executed.
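
The per-scene radiometric step lends itself to a short sketch. The code below applies a first-order polynomial (offset and gain) brightness correction to each band of a scene and blends two overlapping, co-registered scenes using data masks; the array layout, the simple averaging in the overlap, and the way the coefficients are obtained are assumptions for illustration, not the program's actual implementation.

    import numpy as np

    def correct_brightness(scene, coeffs):
        """Apply a first-order polynomial correction, band by band.
        scene:  float array of shape (bands, rows, cols)
        coeffs: list of (offset, gain) pairs, one per band."""
        out = np.empty_like(scene)
        for b, (offset, gain) in enumerate(coeffs):
            out[b] = offset + gain * scene[b]
        return out

    def blend(scene_a, scene_b, mask_a, mask_b):
        """Combine two corrected, co-registered scenes using boolean data
        masks of shape (rows, cols): average where both scenes have data,
        otherwise take whichever scene is valid."""
        overlap = mask_a & mask_b
        out = np.where(mask_a, scene_a, scene_b)
        out[:, overlap] = 0.5 * (scene_a[:, overlap] + scene_b[:, overlap])
        return out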

Posted in: Briefs, TSP

Utilizing AI in Temporal, Spatial, and Resource Scheduling

Aurora is a software system enabling the rapid, easy solution of complex scheduling problems involving spatial and temporal constraints among operations and scarce resources (such as equipment, workspace, and human experts). Although developed for use in the International Space Station Processing Facility, Aurora is flexible enough that it can be easily customized for application to other scheduling domains and adapted as the requirements change or become more precisely known over time. Aurora’s scheduling module utilizes artificial-intelligence (AI) techniques to make scheduling decisions on the basis of domain knowledge, including knowledge of constraints and their relative importance, interdependencies among operations, and possibly frequent changes in governing schedule requirements. Unlike many other scheduling software systems, Aurora focuses on resource requirements and temporal scheduling in combination. For example, Aurora can accommodate a domain requirement to schedule two subsequent operations to locations adjacent to a shared resource. The graphical interface allows the user to quickly visualize the schedule and perform changes reflecting additional knowledge or alterations in the situation. For example, the user might drag the activity corresponding to the start of operations to reflect a late delivery.
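
A minimal sketch of combined temporal and resource reasoning follows. The data structures and the greedy placement strategy are illustrative assumptions; Aurora's actual AI scheduler is more sophisticated, but the sketch shows how an operation's start time can be constrained both by a predecessor and by the availability of a scarce resource.

    # Each operation needs one resource for a duration and may have to wait
    # for a predecessor to finish (a simple temporal constraint).
    OPERATIONS = [
        {"name": "unpack",  "resource": "workspace_1", "duration": 4, "after": None},
        {"name": "inspect", "resource": "expert_A",    "duration": 2, "after": "unpack"},
        {"name": "install", "resource": "workspace_1", "duration": 3, "after": "inspect"},
    ]

    def greedy_schedule(operations):
        """Place each operation at the earliest time that satisfies its
        predecessor constraint and the availability of its resource.
        Operations are assumed to be listed after their predecessors."""
        resource_free = {}   # resource name -> time it becomes free
        finish = {}          # operation name -> finish time
        schedule = {}
        for op in operations:
            earliest = finish.get(op["after"], 0)
            start = max(earliest, resource_free.get(op["resource"], 0))
            end = start + op["duration"]
            schedule[op["name"]] = (start, end)
            finish[op["name"]] = end
            resource_free[op["resource"]] = end
        return schedule

    print(greedy_schedule(OPERATIONS))
    # {'unpack': (0, 4), 'inspect': (4, 6), 'install': (6, 9)}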

Posted in: Briefs

Montage Version 3.0

The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation’s TeraGrid or NASA’s Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation — a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.
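
The grid-execution idea, different parts of a mosaic computed in different places and then combined, can be sketched in a few lines. The sketch below fans tile computations out to a process pool as a stand-in for grid nodes and assembles the results; it uses plain arrays rather than Montage's actual FITS/WCS machinery, and the tile size and function names are illustrative.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    TILE = 256  # output tiles of 256 x 256 pixels (illustrative choice)

    def build_tile(origin):
        """Stand-in for reprojecting and co-adding the input images that
        overlap one output tile; here it just returns a placeholder array."""
        return origin, np.zeros((TILE, TILE))

    def build_mosaic(rows, cols):
        """Split the output space into tiles, compute them in parallel
        (the pool plays the role of remote grid nodes), then assemble."""
        mosaic = np.zeros((rows * TILE, cols * TILE))
        origins = [(r * TILE, c * TILE) for r in range(rows) for c in range(cols)]
        with ProcessPoolExecutor() as pool:
            for (r0, c0), tile in pool.map(build_tile, origins):
                mosaic[r0:r0 + TILE, c0:c0 + TILE] = tile
        return mosaic

    if __name__ == "__main__":
        print(build_mosaic(2, 3).shape)   # (512, 768)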

Posted in: Briefs, TSP

Integrated System for Autonomous Science

The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in “The TechSat 21 Autonomous Sciencecraft Experiment” (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Observing-1 (EO-1) spacecraft and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. The software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under the control of a task-execution component of the software that is capable of responding to anomalies.
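
The detect, replan, and execute loop described above can be sketched compactly. The classifier, the plan representation, and the resource check below are hypothetical placeholders, not the flight software's actual components; the sketch only shows how an onboard detection can trigger a follow-up observation that is planned against available resources and then executed with anomaly handling.

    def autonomy_cycle(image, classify, plan_followup, execute, resources):
        """One cycle: analyze an image onboard and, if a science event is
        detected, plan and execute a follow-up observation."""
        event = classify(image)                      # e.g., volcanic activity, flooding
        if event is None:
            return None
        followup = plan_followup(event, resources)   # checks visibility/constraints
        if followup is None:
            return None                              # target not visible or resources short
        try:
            return execute(followup)
        except RuntimeError:
            # Anomaly during execution: a task executive would respond here,
            # for example by safing the instrument or rescheduling the attempt.
            return None

    # Illustrative use with stub functions:
    result = autonomy_cycle(
        image="scene_042",
        classify=lambda img: {"type": "thermal_anomaly", "target": (19.4, -155.3)},
        plan_followup=lambda event, res: {"target": event["target"], "duration": 120}
                                         if res.get("imager_minutes", 0) >= 2 else None,
        execute=lambda obs: f"acquired follow-up of {obs['target']}",
        resources={"imager_minutes": 10},
    )
    print(result)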

Posted in: Briefs, TSP
