Tech Briefs

“One-Stop Shopping” for Ocean Remote-Sensing and Model Data

OurOcean Portal 2.0 (http:// is a software system designed to let users easily access ocean observation data, both remote-sensing and in situ; configure and run an ocean model with observation data assimilated on a remote computer; and visualize both the observation data and the model outputs. At present, the observation data and models focus on the California coastal regions and on Prince William Sound in Alaska. The system supports both real-time and retrospective analyses of remote-sensing data and model outputs. OurOcean Portal 2.0 incorporates state-of-the-art information technologies (IT) such as a MySQL database, a Java Web server (Apache/Tomcat), the Live Access Server (LAS), interactive graphics with a Java applet on the client side and MatLab/GMT on the server side, and distributed computing. OurOcean currently serves over 20 real-time or historical ocean data products. The data are served as pre-generated plots or in their native data formats, and for some datasets users can choose plotting parameters to produce customized graphics. OurOcean also uses LAS to serve three-dimensional ocean-model outputs generated by the Regional Ocean Modeling System (ROMS). The LAS software, developed by the Pacific Marine Environmental Laboratory (PMEL) of the National Oceanic and Atmospheric Administration (NOAA), is a configurable Web-server program designed to provide flexible access to geo-referenced scientific data. The model output can be viewed as plots in horizontal slices, depth profiles, or time sequences, or can be downloaded as raw data in formats such as NetCDF, ASCII, and binary. Interactive visualization is provided by Ferret, graphics software also developed by PMEL. In addition, OurOcean allows users with minimal computing resources to configure and run an ocean model with data assimilation on a remote computer.
Users may select the forcing input, the data to be assimilated, the simulation period, and the output variables, and then submit the model to run on a back-end parallel computer. When the run is complete, the output is added to the LAS server so the user can retrieve and examine the results.
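The remote-run workflow described above (collect a user's choices, queue the run on a back-end machine, then publish the finished output for LAS to serve) can be sketched as follows. This is a minimal illustration, not the portal's actual code; the `ModelRun` fields, the `RunQueue` class, and the "NOGAPS winds" forcing name are all assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ModelRun:
    """A hypothetical ROMS run request, as a portal might collect it."""
    forcing: str                # forcing input chosen by the user
    assimilated: list           # observation datasets to assimilate
    start: str                  # simulation period (ISO dates)
    end: str
    outputs: list               # output variables to write
    status: str = "queued"

class RunQueue:
    """Toy stand-in for the back-end scheduler and LAS hand-off."""
    def __init__(self):
        self.pending = []
        self.published = []     # completed runs the LAS can now serve

    def submit(self, run):
        self.pending.append(run)
        return run

    def complete(self, run):
        run.status = "done"
        self.pending.remove(run)
        self.published.append(run)   # output becomes retrievable

queue = RunQueue()
run = queue.submit(ModelRun("NOGAPS winds", ["HF-radar currents"],
                            "2006-01-01", "2006-01-31", ["temp", "salt"]))
queue.complete(run)
print(run.status, len(queue.published))   # done 1
```

The design point worth noting is the hand-off at completion: the run object itself moves into the "published" collection, so the visualization layer never sees half-finished output.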

Posted in: Briefs, TSP, Software, Computer simulation, Remote sensing, Data management, Marine vehicles and equipment


State Analysis Database Tool

The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system-engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and for defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system-engineering artifacts that address different aspects of a mission’s life cycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions. The state-based control architecture shown in the figure is the foundation of the present software.
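The core State Analysis idea, that a state is a momentary condition and a model describes how one state evolves under the influence of others, can be illustrated with a small sketch. This is not the tool's actual schema; the `State` and `Model` classes and the battery example are hypothetical.

```python
class State:
    """A momentary condition of an evolving system."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value
        self.affected_by = []   # states whose evolution influences this one

class Model:
    """Describes how one state evolves, possibly as a function of others."""
    def __init__(self, state, inputs, update):
        self.state = state
        self.inputs = inputs
        state.affected_by.extend(inputs)   # record the coupling explicitly
        self.update = update

    def step(self):
        self.state.value = self.update(*(s.value for s in self.inputs))

# Illustrative example: battery state of charge is affected by a
# load-current state; the model captures that dependency explicitly.
soc = State("battery_soc", 1.0)
load = State("load_current", 2.0)
model = Model(soc, [load], lambda i: max(0.0, soc.value - 0.01 * i))
model.step()
print(soc.value)   # 0.98
```

Making the "affected by" relation explicit, rather than burying it in documents, is precisely what allows requirements and operations plans to be checked for consistency against the models.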

Posted in: Briefs, TSP, Software, Architecture, Computer software and hardware, Collaboration and partnering, Systems engineering


Generating CAHV and CAHVOR Images With Shadows in ROAMS

Part of the Rover Analysis, Modeling and Simulation (ROAMS) software that synthesizes images of terrain has been augmented to make the images more realistic. [ROAMS was described in “Simulating Operation of a Planetary Rover” (NPO-30722), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 52. ROAMS simulates the operation of a robotic vehicle (rover) exploring terrain on a remote planet.] The images are needed for modeling responses of rover cameras that provide sensory inputs for machine-vision-based algorithms for controlling the motion of the rover. The augmented image-synthesizing part of the ROAMS software supports terrain geometry and texture specifiable by the user, CAHV and CAHVOR camera models, and more-realistic shadowing (see figure). (The letters in “CAHV” represent vectors in a standard photogrammetric model of a pinhole camera. Letters O and R in “CAHVOR” represent vectors used to model distortions.) A contemplated future version of ROAMS would support the CAHVORE model, which represents more-general cameras, including those having fish-eye or other wide-field-of-view lenses. (Letter E in “CAHVORE” represents a vector used to model apparent motion of a camera entrance pupil.) The examples of shadowing in the figure show terrain and rover shadows; pixels that do not have a direct line of sight to the Sun are darkened.
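In the standard CAHV pinhole model mentioned above, a 3-D point P projects to image coordinates by dotting the ray (P − C) against the H and V vectors and normalizing by its component along the optical axis A. The sketch below shows that projection; the specific camera numbers are illustrative only.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return tuple(a - b for a, b in zip(u, v))

def cahv_project(P, C, A, H, V):
    """Project 3-D point P to image (x, y) with the CAHV pinhole model.
    C: camera center; A: unit vector along the optical axis; H, V:
    horizontal/vertical image vectors (these fold in the focal length
    and image-center offsets)."""
    d = sub(P, C)
    r = dot(d, A)                       # range along the optical axis
    return dot(d, H) / r, dot(d, V) / r

# A camera at the origin looking down +z, with a 1000-pixel focal
# length and image center at (512, 512) -- illustrative numbers.
C = (0.0, 0.0, 0.0)
A = (0.0, 0.0, 1.0)
H = (1000.0, 0.0, 512.0)
V = (0.0, 1000.0, 512.0)
x, y = cahv_project((1.0, 0.5, 10.0), C, A, H, V)
print(round(x, 1), round(y, 1))   # 612.0 562.0
```

The CAHVOR and CAHVORE variants extend exactly this computation: the O and R vectors perturb the ray to model lens distortion before the same dot products are taken.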

Posted in: Briefs, TSP, Software, CAD, CAM, and CAE, Optics, Terrain, Spacecraft


Improving UDP/IP Transmission Without Increasing Congestion

Datagram Retransmission (DGR) is a computer program that, within certain limits, ensures the reception of each datagram transmitted under the User Datagram Protocol/Internet Protocol. [User Datagram Protocol (UDP) is considered unreliable because it does not involve a reliability-ensuring connection-initiation dialogue between sender and receiver. UDP is well suited to issuing many small messages to many different receivers.] Unlike prior software for ensuring reception of UDP datagrams, DGR does not contribute to network congestion by retransmitting data more frequently as an ever-increasing number of messages and acknowledgements is lost. Instead, DGR does just the opposite: it includes an adaptive timeout-interval-computing component that provides maximum opportunity for reception of acknowledgements, minimizing retransmission. By monitoring changes in the rate at which message-transmission transactions are completed, DGR detects changes in the level of congestion and responds by imposing varying degrees of delay on the transmission of new messages. In addition, DGR maximizes throughput by not waiting for acknowledgement of a message before sending the next message. All DGR communication is asynchronous, to maximize efficient utilization of network connections. DGR manages multiple concurrent datagram transmission and acknowledgement conversations.
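To make the adaptive timeout idea concrete, here is a smoothed round-trip-time estimator in the style of TCP's retransmission-timeout computation (Jacobson/Karels, as standardized in RFC 6298). This illustrates the kind of adaptive timeout-interval computation the brief describes; it is not DGR's actual algorithm, and the constants are the TCP defaults, not DGR's.

```python
class AdaptiveTimeout:
    """Smoothed RTT estimator with a variance-padded timeout.

    A generous, adaptively computed timeout gives acknowledgements the
    maximum opportunity to arrive, so fewer retransmissions are sent
    when the network is merely slow rather than lossy.
    """
    def __init__(self, initial=1.0):
        self.srtt = initial            # smoothed round-trip time (s)
        self.rttvar = initial / 2      # smoothed RTT deviation

    def observe(self, rtt):
        """Fold one measured round-trip time into the estimate."""
        self.rttvar = 0.75 * self.rttvar + 0.25 * abs(self.srtt - rtt)
        self.srtt = 0.875 * self.srtt + 0.125 * rtt

    def timeout(self):
        # Pad the mean with 4x the deviation: under congestion the
        # deviation grows, so the timeout stretches instead of firing.
        return self.srtt + 4 * self.rttvar

t = AdaptiveTimeout()
for rtt in (0.2, 0.25, 0.22, 0.9):   # last sample: a congestion spike
    t.observe(rtt)
print(t.timeout() > t.srtt)   # True: the timeout always leaves headroom
```

The key property, shared with DGR's approach as described, is that worsening conditions lengthen the timeout rather than triggering a retransmission storm.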

Posted in: Briefs, TSP, Software, Adaptive control, Communication protocols, Data exchange, Internet


FORTRAN Versions of Reformulated HFGMC Codes

Several FORTRAN codes have been written to implement the reformulated version of the high-fidelity generalized method of cells (HFGMC). Various aspects of the HFGMC and its predecessors were described in several prior NASA Tech Briefs articles, the most recent being “HFGMC Enhancement of MAC/GMC” (LEW-17818-1), NASA Tech Briefs, Vol. 30, No. 3 (March 2006), page 34. The HFGMC is a mathematical model of micromechanics for simulating stress and strain responses of fiber/matrix and other composite materials. The HFGMC overcomes a major limitation of a prior version of the GMC by accounting for coupling of shear and normal stresses, and thereby affords greater accuracy, albeit at a large computational cost. The reformulation of the HFGMC addressed this issue of computational efficiency; as a result, codes that implement the reformulated HFGMC complete their calculations about 10 times as fast as those that implement the original HFGMC. The present FORTRAN implementations of the reformulated HFGMC were written to satisfy a need for compatibility with other FORTRAN programs used to analyze structures and composite materials. The FORTRAN implementations also afford capabilities, beyond those of the basic HFGMC, for modeling inelasticity, fiber/matrix debonding, and coupled thermal, mechanical, piezoelectric, and electromagnetic effects.

Posted in: Briefs, TSP, Software, Computer simulation, Mathematical models, Composite materials, Fatigue


Program for Editing Spacecraft Command Sequences

Sequence Translator, Editor, and Expander Resource (STEER) is a computer program that facilitates construction of sequences and blocks of sequences (hereafter denoted generally as sequence products) for commanding a spacecraft. STEER also provides mechanisms for translating among various sequence product types and for quickly expanding the activities of a given sequence in chronological order for review and analysis. To date, construction of sequence products has generally been done with such clumsy mechanisms as text-editor programs, translating among sequence product types has been challenging, and expanding sequences to time-ordered lists has involved the arduous process of converting sequence products to “real” sequences and running them through Class-A software (defined, loosely, as flight and ground software critical to a spacecraft mission). Also, heretofore, generating sequence products in standard formats has been troublesome because precise formatting and syntax are required. STEER alleviates these issues by providing a graphical user interface containing intuitive fields in which the user can enter the necessary information. The STEER expansion function provides a “quick and dirty” means of seeing how a sequence or sequence block would expand into a chronological list, without the need to use Class-A software.
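The expansion function described above, flattening nested sequence products into one time-ordered command list, can be sketched as follows. The product format here (dicts with relative times and optional nested blocks) is invented for the example and is not STEER's actual representation.

```python
def expand(products):
    """Flatten sequence products (commands and nested blocks) into one
    chronological command list -- a sketch of a 'quick and dirty'
    expansion, using a made-up product format."""
    out = []
    for item in products:
        if item.get("block"):                 # a nested sequence block
            base = item["t"]                  # block's start offset
            for cmd in expand(item["block"]):
                out.append({"t": base + cmd["t"], "cmd": cmd["cmd"]})
        else:                                 # an individual command
            out.append({"t": item["t"], "cmd": item["cmd"]})
    return sorted(out, key=lambda c: c["t"])

seq = [
    {"t": 10, "cmd": "PWR_ON"},
    {"t": 0, "block": [{"t": 5, "cmd": "HEATER_ON"},
                       {"t": 2, "cmd": "SELFTEST"}]},
    {"t": 3, "cmd": "TLM_RATE_HIGH"},
]
print([c["cmd"] for c in expand(seq)])
# ['SELFTEST', 'TLM_RATE_HIGH', 'HEATER_ON', 'PWR_ON']
```

The value of such a pass is exactly what the brief claims: an engineer can see interleaving problems across blocks at a glance, without converting anything into a flight-ready sequence first.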

Posted in: Briefs, TSP, Software, Computer software and hardware, Flight control systems, Human machine interface (HMI), Spacecraft


Flight-Tested Prototype of BEAM Software

A software prototype of BEAM (Beacon-based Exception Analysis for Multimissions) has been developed, and its operation was successfully tested in flight onboard a NASA research aircraft. BEAM (see NASA Tech Briefs, Vol. 26, No. 9; and Vol. 27, No. 3) is an ISHM (Integrated Systems Health Management) technology that automatically analyzes sensor data and classifies system behavior as either nominal or anomalous, and further characterizes anomalies according to strength, duration, and affected signals. BEAM (see figure) can be used to monitor a wide variety of physical systems and sensor types in real time. In this series of tests, BEAM monitored the engines of a Dryden Flight Research Center F-18 aircraft, performing onboard, unattended analysis of 26 engine sensors from engine startup to shutdown. The BEAM algorithm can detect anomalies based solely on the sensor data, including but not limited to sensor failure, performance degradation, incorrect operation (such as unplanned engine shutdown or flameout in this example), and major system faults. BEAM was tested on an F-18 simulator, in static engine tests, and on 25 individual flights totaling approximately 60 hours of flight time. During these tests, BEAM successfully identified planned anomalies (in-flight shutdowns of one engine) as well as minor unplanned anomalies (e.g., transient oil- and fuel-pressure drops), with no false alarms or suspected false-negative results for the period tested. BEAM also detected previously unknown behavior in the F-18 compressor section during several flights. This result, confirmed by direct analysis of the raw data, serves as a significant test of BEAM’s capability. The top-level BEAM architecture shown in the figure is used for monitoring physical systems in real time.
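The nominal/anomalous classification described above can be illustrated with a toy detector: a running z-score over a sliding window of recent readings, which flags a sudden pressure-style drop against recent history. This is a deliberately simple stand-in to show the shape of the problem, not BEAM's actual algorithm, and the window and threshold values are arbitrary.

```python
from collections import deque
from statistics import mean, stdev

class SensorMonitor:
    """Running z-score anomaly flagger over a sliding window -- a toy
    stand-in for nominal/anomalous classification, not BEAM itself."""
    def __init__(self, window=20, threshold=4.0):
        self.history = deque(maxlen=window)   # recent nominal readings
        self.threshold = threshold            # z-score trip level

    def classify(self, reading):
        if len(self.history) >= 3:            # need a baseline first
            mu, sigma = mean(self.history), stdev(self.history)
            anomalous = sigma > 0 and abs(reading - mu) / sigma > self.threshold
        else:
            anomalous = False
        self.history.append(reading)
        return "anomalous" if anomalous else "nominal"

m = SensorMonitor()
# Five steady oil-pressure-style readings, then a sharp transient drop.
labels = [m.classify(x) for x in [50.1, 50.0, 49.9, 50.2, 50.0, 12.0]]
print(labels[-1])   # anomalous
```

A real ISHM system goes much further, characterizing strength, duration, and the set of affected signals across many sensors at once, but the core contrast between a learned baseline and a deviating reading is the same.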

Posted in: Briefs, TSP, Software, Sensors and actuators, Vehicle health management, Flight tests, Commercial aircraft

