Automated Vectorization of Decision-Based Algorithms

Virtually all existing vectorization algorithms analyze only the numeric properties of a program and distribute those elements across multiple processors. This software advances the state of the practice because, at the time of this reporting, it is the only known system that takes high-level statements, analyzes them for their decision properties, and converts them into a form that can be executed automatically in parallel. The software takes a high-level source program that describes a complex decision-based condition and rewrites it as a disjunctive set of component Boolean relations that can then be executed in parallel. This matters because parallel architectures are becoming more commonplace in conventional systems, and they have always been present in NASA flight systems. The technology allows existing condition-based code to be vectorized automatically so that it decomposes naturally across parallel architectures.
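
The brief does not publish the rewriting algorithm itself, so the following is only a minimal Python sketch of the idea it describes: a compound decision-based condition is rewritten as a disjunction of component Boolean relations, each of which is independent and can therefore be evaluated in parallel. The relation functions and the thread pool are illustrative stand-ins, not the NASA implementation.

```python
# Minimal sketch (not the NASA code): the compound decision
#   (a > 0 and b < 10) or (c == 5) or (a + c > 7)
# rewritten as a disjunctive set of component Boolean relations,
# each evaluated independently and in parallel.
from concurrent.futures import ThreadPoolExecutor

def rel_1(a, b, c):
    return a > 0 and b < 10

def rel_2(a, b, c):
    return c == 5

def rel_3(a, b, c):
    return a + c > 7

RELATIONS = (rel_1, rel_2, rel_3)

def decide(a, b, c):
    """True if any component relation holds (the disjunction).
    The relations are mutually independent, so they decompose
    naturally across parallel hardware; a thread pool stands in
    for that hardware here."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(rel, a, b, c) for rel in RELATIONS]
        return any(f.result() for f in futures)

print(decide(1, 5, 9))  # True: rel_1 and rel_3 both hold
```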

Posted in: Briefs, TSP, Software

Grayscale Optical Correlator Workbench

Grayscale Optical Correlator Workbench (GOCWB) is a computer program for use in automatic target recognition (ATR). GOCWB performs ATR with an accurate simulation of a hardware grayscale optical correlator (GOC); the simulation is used to test filters that are created in GOCWB. Thus, GOCWB can be used as a stand-alone ATR software tool or in combination with GOC hardware for building (target training), testing, and optimizing filters. The software is divided into three main parts, denoted training, filter, and testing. The training part is used for assembling training images as input to a filter. The filter part is used for combining training images into a filter and optimizing that filter. The testing part is used for testing new filters and for general simulation of GOC output. The current version of GOCWB relies on MATLAB binaries for matrix operations and fast Fourier transforms. Optimization of filters is based on the OT-MACH algorithm, in which variables specified by the user are parameterized and the best filter is selected on the basis of an average result for correct identification of targets in multiple test images.
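
GOCWB's internals are not published in this brief, but the OT-MACH filter itself is well documented in the ATR literature, so a simplified sketch of the selection loop is possible. Everything below (the function names, the identity noise term, the peak-threshold scoring rule, the parameter grid) is an illustrative assumption, not the GOCWB implementation: candidate (alpha, beta, gamma) settings are swept, and the filter with the best average correct-identification result over the test images is kept.

```python
# Simplified OT-MACH sketch, not the GOCWB implementation.
# In the frequency domain the filter is commonly written
#   H = conj(M) / (alpha*C + beta*D + gamma*S)
# with M the mean training spectrum, C a noise term (identity here),
# D the mean power spectrum, and S a similarity (variance) term.
import itertools
import numpy as np

def otmach_filter(train, alpha, beta, gamma):
    X = np.fft.fft2(train)                  # spectra of the training images
    M = X.mean(axis=0)                      # mean spectrum
    D = (np.abs(X) ** 2).mean(axis=0)       # mean power spectrum
    S = (np.abs(X - M) ** 2).mean(axis=0)   # deviation-from-mean term
    C = np.ones_like(D)                     # white-noise stand-in
    return np.conj(M) / (alpha * C + beta * D + gamma * S)

def score(H, tests, threshold):
    """Fraction of test images whose correlation peak exceeds threshold."""
    peaks = [np.abs(np.fft.ifft2(np.fft.fft2(t) * H)).max() for t in tests]
    return float(np.mean([p > threshold for p in peaks]))

def best_filter(train, tests, grid=(0.01, 0.1, 1.0), threshold=0.5):
    """Parameterize (alpha, beta, gamma); keep the best-scoring filter."""
    best = max(itertools.product(grid, repeat=3),
               key=lambda abg: score(otmach_filter(train, *abg), tests, threshold))
    return best, otmach_filter(train, *best)
```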

Posted in: Briefs, TSP, Software

“One-Stop Shopping” for Ocean Remote-Sensing and Model Data

OurOcean Portal 2.0 (http://ourocean.jpl.nasa.gov) is a software system designed to enable users to easily gain access to ocean observation data, both remote-sensing and in-situ; to configure and run an ocean model, with observation data assimilated, on a remote computer; and to visualize both the observation data and the model outputs. At present, the observation data and models focus on the California coastal regions and Prince William Sound in Alaska. The system can be used to perform both real-time and retrospective analyses of remote-sensing data and model outputs. OurOcean Portal 2.0 incorporates state-of-the-art information technologies such as a MySQL database, a Java Web server (Apache Tomcat), the Live Access Server (LAS), interactive graphics with Java applets on the client side and MATLAB/GMT on the server side, and distributed computing. OurOcean currently serves over 20 real-time or historical ocean data products. The data are served as pre-generated plots or in their native data formats; for some of the datasets, users can choose different plotting parameters and produce customized graphics. OurOcean also serves 3D ocean-model outputs generated by the Regional Ocean Modeling System (ROMS) through LAS. LAS, developed by the Pacific Marine Environmental Laboratory (PMEL) of the National Oceanic and Atmospheric Administration (NOAA), is a configurable Web-server program designed to provide flexible access to geo-referenced scientific data. The model output can be viewed as plots in horizontal slices, depth profiles, or time sequences, or can be downloaded as raw data in formats such as NetCDF, ASCII, or binary. Interactive visualization is provided by Ferret, a graphics package also developed by PMEL. In addition, OurOcean allows users with minimal computing resources to configure and run an ocean model with data assimilation on a remote computer: users may select the forcing input, the data to be assimilated, the simulation period, and the output variables, and submit the model to run on a back-end parallel computer. When the run is complete, the output is added to the LAS server for the user to retrieve and examine.
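
Since the portal lets users download model output as raw NetCDF, a short hedged sketch of what a user might do with such a file may help. The filename, the "temp" variable name, and the dimension ordering below are typical of ROMS output but are assumptions, not portal specifics.

```python
# Hedged sketch: inspect a ROMS NetCDF file downloaded from the portal.
# "roms_output.nc" and the variable name "temp" (potential temperature)
# are assumptions; actual names depend on the model configuration.
from netCDF4 import Dataset

with Dataset("roms_output.nc") as nc:
    print(list(nc.variables))            # see what the file actually contains
    temp = nc.variables["temp"]          # e.g. time x s_rho x eta x xi
    print(temp.shape, getattr(temp, "units", "no units attribute"))
    surface = temp[0, -1, :, :]          # first time step, top layer
    print(float(surface.min()), float(surface.max()))
```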

Posted in: Briefs, TSP, Software

State Analysis Database Tool

The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based systems engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and for defining goal-based operational plans consistent with those models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are captured as mutually consistent State Analysis artifacts stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. It is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
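
The brief does not publish the database schema, so the following tiny Python sketch is meant only to make the core State Analysis vocabulary concrete: state variables, models that describe how states evolve and affect one another, and goals that constrain states over time. All class and field names are illustrative assumptions.

```python
# Illustrative data model only; the actual State Analysis Database
# schema is not given in this brief.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StateVariable:
    """A momentary condition of the evolving system."""
    name: str
    units: str

@dataclass
class Model:
    """Describes how a state evolves and which other states affect it."""
    describes: StateVariable
    affected_by: List[StateVariable] = field(default_factory=list)
    description: str = ""

@dataclass
class Goal:
    """A goal-based operations plan constrains a state over time."""
    state: StateVariable
    constraint: str

charge = StateVariable("battery_charge", "%")
heater = StateVariable("heater_power", "W")
discharge = Model(describes=charge, affected_by=[heater],
                  description="charge falls as heater power draw rises")
downlink = Goal(state=charge, constraint="charge >= 60% during downlink")
```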

Posted in: Briefs, TSP, Software

Generating CAHV and CAHVOR Images With Shadows in ROAMS

Part of the Rover Analysis, Modeling and Simulation (ROAMS) software that synthesizes images of terrain has been augmented to make the images more realistic. [ROAMS was described in “Simulating Operation of a Planetary Rover” (NPO-30722), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 52. ROAMS simulates the operation of a robotic vehicle (rover) exploring terrain on a remote planet.] The images are needed for modeling responses of rover cameras that provide sensory inputs for machine-vision-based algorithms for controlling the motion of the rover. The augmented image-synthesizing part of the ROAMS software supports user-specifiable terrain geometry and texture, CAHV and CAHVOR camera models, and more-realistic shadowing: pixels that do not have a direct line of sight to the Sun are darkened, producing both terrain and rover shadows. (The letters in “CAHV” represent vectors in a standard photogrammetric model of a pinhole camera. The letters O and R in “CAHVOR” represent vectors used to model distortions.) A contemplated future version of ROAMS would support the CAHVORE model, which represents more-general cameras, including those having fish-eye or other wide-field-of-view lenses. (The letter E in “CAHVORE” represents a vector used to model apparent motion of the camera entrance pupil.)
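
The CAHV formulation is standard in photogrammetry, so a short sketch can show how it maps a 3-D point to image coordinates. The numeric camera parameters below are made up for illustration, and the CAHVOR distortion terms and the shadow test are omitted.

```python
# Standard CAHV pinhole projection (CAHVOR's O and R distortion vectors
# omitted). C is the camera center, A the unit pointing axis, and H, V
# fold together focal length, pixel scale, and image-center offsets.
import numpy as np

def cahv_project(P, C, A, H, V):
    """Project 3-D point P to (column, row) image coordinates."""
    d = np.asarray(P, dtype=float) - C   # ray from camera center to point
    depth = d @ A                        # range along the pointing axis
    if depth <= 0:
        raise ValueError("point is behind the camera")
    return (d @ H) / depth, (d @ V) / depth

# Illustrative camera at the origin looking down +Z.
C = np.zeros(3)
A = np.array([0.0, 0.0, 1.0])
f, cx, cy = 800.0, 512.0, 512.0          # focal length, image center (pixels)
H = f * np.array([1.0, 0.0, 0.0]) + cx * A
V = f * np.array([0.0, 1.0, 0.0]) + cy * A
print(cahv_project([0.5, -0.2, 4.0], C, A, H, V))  # -> (612.0, 472.0)
```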

Posted in: Briefs, TSP, Software

Improving UDP/IP Transmission Without Increasing Congestion

Datagram Retransmission (DGR) is a computer program that, within certain limits, ensures the reception of each datagram transmitted under the User Datagram Protocol/Internet Protocol (UDP/IP). [UDP is considered unreliable because it does not involve a reliability-ensuring connection-initiation dialogue between sender and receiver; it is well suited to issuing many small messages to many different receivers.] Unlike prior software for ensuring reception of UDP datagrams, DGR does not contribute to network congestion by retransmitting data more and more frequently as increasing numbers of messages and acknowledgements are lost. Instead, DGR does just the opposite: it includes an adaptive timeout-interval-computing component that provides maximum opportunity for reception of acknowledgements, thereby minimizing retransmission. By monitoring changes in the rate at which message-transmission transactions are completed, DGR detects changes in the level of congestion and responds by imposing varying degrees of delay on the transmission of new messages. In addition, DGR maximizes throughput by not waiting for acknowledgement of one message before sending the next. All DGR communication is asynchronous, to maximize efficient utilization of network connections, and DGR manages multiple concurrent datagram-transmission and acknowledgement conversations.
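
The brief does not give DGR's exact formulas, so the sketch below substitutes well-known stand-ins: the classic smoothed round-trip-time estimator (as used for TCP retransmission timeouts) for the adaptive timeout interval, and a delay on new transmissions that grows as the transaction-completion rate falls. The class names and the 0.5-second scale factor are illustrative assumptions.

```python
# Hedged sketch, not the DGR algorithm: an adaptive retransmission
# timeout plus a congestion-sensitive delay on new transmissions.
class AdaptiveTimeout:
    """Smoothed-RTT estimator in the style of TCP (RFC 6298)."""
    def __init__(self, alpha=0.125, beta=0.25):
        self.srtt = None       # smoothed round-trip time, seconds
        self.rttvar = 0.0      # round-trip-time variation estimate
        self.alpha, self.beta = alpha, beta

    def observe_rtt(self, rtt):
        """Fold one measured ack round-trip time into the estimate."""
        if self.srtt is None:
            self.srtt, self.rttvar = rtt, rtt / 2.0
        else:
            self.rttvar += self.beta * (abs(self.srtt - rtt) - self.rttvar)
            self.srtt += self.alpha * (rtt - self.srtt)

    def interval(self):
        """Wait long enough that acks have ample time to arrive,
        so retransmission stays a last resort."""
        return (self.srtt or 1.0) + 4.0 * self.rttvar

def send_delay(completed_per_s, nominal_per_s, scale=0.5):
    """Delay new transmissions in proportion to the completion-rate
    shortfall: back off as congestion rises, rather than retransmit."""
    shortfall = max(0.0, 1.0 - completed_per_s / nominal_per_s)
    return scale * shortfall   # seconds of imposed delay
```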

Posted in: Briefs, TSP, Software

FORTRAN Versions of Reformulated HFGMC Codes

Several FORTRAN codes have been written to implement the reformulated version of the high-fidelity generalized method of cells (HFGMC). Various aspects of the HFGMC and its predecessors were described in several prior NASA Tech Briefs articles, the most recent being “HFGMC Enhancement of MAC/GMC” (LEW-17818-1), NASA Tech Briefs, Vol. 30, No. 3 (March 2006), page 34. The HFGMC is a mathematical model of micromechanics for simulating the stress and strain responses of fiber/matrix and other composite materials. It overcomes a major limitation of the earlier generalized method of cells (GMC) by accounting for the coupling of shear and normal stresses, and thereby affords greater accuracy, albeit at a large computational cost. The reformulation of the HFGMC addressed this issue of computational efficiency: codes that implement the reformulated HFGMC complete their calculations about 10 times as fast as those that implement the original formulation. The present FORTRAN implementations were written to satisfy a need for compatibility with other FORTRAN programs used to analyze structures and composite materials. They also afford capabilities, beyond those of the basic HFGMC, for modeling inelasticity, fiber/matrix debonding, and coupled thermal, mechanical, piezoelectric, and electromagnetic effects.

Posted in: Briefs, TSP, Software
