Core Technical Capability Laboratory Management System

The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data; the software is hosted on a server that allows users remote access. Users log into the system with their KSC (or other domain) username and password. They are authenticated within that domain, and their CTCLMS user privileges are then verified within the system. Menu options are displayed according to each user’s privileges (roles). CTCLMS users are assigned roles such as Lab Member, Lab Manager, Natural Neighbor Integration Manager, Organizational Manager, CTC Program Manager, or Administrator. The assigned role determines the user’s capabilities within the system. Users navigate the menu to view, edit, modify, or delete laboratory and equipment data, generate financial and managerial reports, and perform other CTC lab-related functions and analyses.
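
The role-to-menu mapping described above can be sketched as a simple lookup. This is a minimal illustration, not the CTCLMS implementation; the role names come from the brief, but the menu-option names are hypothetical.

```python
# Hypothetical sketch of CTCLMS-style role-based menu filtering.
# Role names are from the brief; menu options are illustrative only.
ROLE_MENUS = {
    "Lab Member": {"view_lab_data"},
    "Lab Manager": {"view_lab_data", "edit_equipment", "generate_reports"},
    "CTC Program Manager": {"view_lab_data", "generate_reports",
                            "financial_reports"},
    "Administrator": {"view_lab_data", "edit_equipment", "generate_reports",
                      "financial_reports", "manage_users"},
}

def menu_options(role: str) -> set:
    """Return the menu options visible to a user with the given role.

    Unknown roles see no options, mirroring the brief's point that the
    assigned role determines a user's capabilities within the system.
    """
    return ROLE_MENUS.get(role, set())
```

A real system would resolve the role after domain authentication and render only the returned options into the page.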

Posted in: Briefs, TSP, Information Sciences


MRO SOW Daily Script

The MRO SOW daily script (wherein “MRO” signifies “Mars Reconnaissance Orbiter” and “SOW” signifies “sequence systems engineer of the week”) is a computer program that automates portions of the MRO daily SOW procedure, which includes checking file-system sizes and automated sequence processor (ASP) log files. The MRO SOW daily script effects clear reporting of (1) the status of, and requirements imposed on, the file system and (2) the ASP log files.
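
The two checks named in the brief (file-system sizes and ASP log files) can be sketched as follows. This is an illustrative reconstruction, not the actual MRO SOW script; the warning threshold, log directory layout, and `ERROR` keyword are assumptions.

```python
# Illustrative sketch of the two daily checks described in the brief:
# (1) file-system usage, (2) scanning ASP log files for notable lines.
# Threshold, log layout, and the "ERROR" keyword are hypothetical.
import shutil
from pathlib import Path

def check_file_system(path: str, warn_fraction: float = 0.9) -> str:
    """Report usage of the file system containing `path`."""
    usage = shutil.disk_usage(path)
    frac = usage.used / usage.total
    status = "WARNING" if frac >= warn_fraction else "OK"
    return f"{path}: {frac:.0%} used ({status})"

def scan_asp_logs(log_dir: str, keyword: str = "ERROR") -> list:
    """Collect lines containing `keyword` from *.log files in `log_dir`."""
    hits = []
    for log in sorted(Path(log_dir).glob("*.log")):
        for line in log.read_text(errors="replace").splitlines():
            if keyword in line:
                hits.append(f"{log.name}: {line}")
    return hits
```

Run daily (e.g., from cron), the combined output gives the clear status report the brief describes.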

Posted in: Briefs, Information Sciences, Artificial intelligence, Computer software and hardware, Documentation


Object Recognition Using Feature- and Color-Based Methods

The combination of the two methods works better than either method alone. An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method combines two prior object-recognition methods — one based on adaptive detection of shape features and one based on adaptive color segmentation — to enable recognition in situations in which either prior method by itself may be inadequate.
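
The core idea of combining the two detectors can be sketched as a score fusion. This is a simplified stand-in for the adaptive neural-network method in the brief: the weighted-average fusion rule, the weights, and the threshold are all assumptions for illustration.

```python
# Simplified sketch of fusing a shape-feature detector with a
# color-segmentation detector. The fusion rule and threshold are
# hypothetical; the brief's method is an adaptive neural network.
def fuse_detections(shape_score: float, color_score: float,
                    w_shape: float = 0.5) -> float:
    """Combine per-detector confidences (each in [0, 1]) into one score."""
    return w_shape * shape_score + (1.0 - w_shape) * color_score

def recognized(shape_score: float, color_score: float,
               threshold: float = 0.6) -> bool:
    """Declare recognition when the fused confidence meets the threshold."""
    return fuse_detections(shape_score, color_score) >= threshold
```

The fused score can succeed where one cue fails: under a sudden perspective change the shape score may drop while the color score stays high, and the combination still clears the threshold.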

Posted in: Briefs, TSP, Information Sciences, Imaging and visualization, Neural networks


Root Source Analysis/ValuStream™ — a Methodology for Identifying and Managing Risks

Root sources of uncertainty are taken into account in a rigorous, systematic way. Root Source Analysis (RoSA) is a systems-engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). The methodology is sometimes referred to as the ValuStream method because inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer’s mission-driven requirements. RoSA and ValuStream are synonymous terms.
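
The prioritization step — linking knowledge shortfalls to requirements and ranking them — can be sketched with a simple risk score. This is not the RoSA/ValuStream scoring scheme, which the brief does not detail; the likelihood × impact product is a generic risk heuristic used here only to illustrate the idea.

```python
# Illustrative ranking of knowledge shortfalls against mission-driven
# requirements. The likelihood-times-impact score is a generic risk
# heuristic, NOT the actual RoSA/ValuStream scoring method.
from dataclasses import dataclass

@dataclass
class KnowledgeShortfall:
    wbs_element: str           # WBS/PBS element the shortfall affects
    description: str
    likelihood: float          # 0..1: chance the critical assumption fails
    requirement_impact: float  # 0..1: severity against the requirement

def prioritize(shortfalls):
    """Rank shortfalls by risk score (likelihood x impact), highest first."""
    return sorted(shortfalls,
                  key=lambda s: s.likelihood * s.requirement_impact,
                  reverse=True)
```

Ranking uncertainties this way focuses engineering effort on the assumptions whose failure would most threaten the customer's requirements.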

Posted in: Briefs, Information Sciences, Risk management


Ensemble: an Architecture for Mission-Operations Software

Several issues are addressed by capitalizing on the Eclipse open-source software framework. “Ensemble” is the name of an open architecture for, and a methodology for the development of, spacecraft mission-operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software.

Posted in: Briefs, TSP, Information Sciences, Architecture, Computer software and hardware, Flight management systems, Spacecraft


Toward Better Modeling of Supercritical Turbulent Mixing

A study was done as part of an effort to develop computational models representing turbulent mixing under thermodynamic supercritical (here, high pressure) conditions. The question was whether the large-eddy simulation (LES) approach, developed previously for atmospheric-pressure compressible-perfect-gas and incompressible flows, can be extended to real-gas non-ideal (including supercritical) fluid mixtures. [In LES, the governing equations are approximated such that the flow field is spatially filtered and sub-grid-scale (SGS) phenomena are represented by models.] The study included analyses of results from direct numerical simulation (DNS) of several such mixing layers based on the Navier-Stokes, total-energy, and conservation-of-chemical-species governing equations.
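
The spatial filtering at the heart of LES can be illustrated with the simplest filter, a one-dimensional top-hat (box) average. This sketch shows only the filtering concept from the bracketed definition above; it is not the study's formulation, and real LES applies such filters to 3-D flow fields with SGS models supplying the unresolved terms.

```python
# 1-D top-hat (box) spatial filter, the simplest LES-style filter:
# each filtered value is the average over a window of `width` points.
# Illustrative only; real LES filters 3-D fields and adds SGS models.
def box_filter(field, width: int = 3):
    """Return the box-filtered copy of a 1-D sequence of samples.

    Endpoints shrink the window (one-sided averaging) rather than pad.
    """
    n = len(field)
    half = width // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        window = field[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

Filtering removes scales smaller than the window, which is exactly the information the SGS models must then represent.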

Posted in: Briefs, Information Sciences, Simulation and modeling, Thermodynamics, Gases


JPEG 2000 Encoding With Perceptual Distortion Control

The bit rate for a given level of perceptual distortion is minimized. An alternative approach has been devised for encoding image data in compliance with JPEG 2000, the most recent still-image data-compression standard of the Joint Photographic Experts Group. Heretofore, JPEG 2000 encoding has been implemented by several related schemes classified as rate-based distortion-minimization encoding. In each of these schemes, the end user specifies a desired bit rate and the encoding algorithm strives to attain that rate while minimizing a mean squared error (MSE). While rate-based distortion minimization is appropriate for transmitting data over a limited-bandwidth channel, it is not the best approach for applications in which the perceptual quality of reconstructed images is a major consideration. A better approach for such applications is the present alternative one, denoted perceptual distortion control, in which the encoding algorithm strives to compress data to the lowest bit rate that yields at least a specified level of perceptual image quality.
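
The search implied by perceptual distortion control — the lowest bit rate that still meets a quality target — can be sketched as a bisection, assuming quality increases monotonically with rate. This is an illustration of the control objective only, not the encoder's actual algorithm; `quality_at_rate` stands in for an encode/decode/measure step and its bounds are hypothetical.

```python
# Sketch of the perceptual-distortion-control objective: find the lowest
# bit rate whose reconstructed-image quality meets a target. Assumes
# quality is monotone in rate; `quality_at_rate` is a hypothetical
# stand-in for encoding, decoding, and scoring with a perceptual metric.
def min_rate_for_quality(quality_at_rate, target: float,
                         lo: float = 0.05, hi: float = 4.0,
                         tol: float = 0.01) -> float:
    """Bisect on bit rate (bits/pixel) for the lowest rate meeting `target`."""
    if quality_at_rate(hi) < target:
        raise ValueError("target quality unreachable within rate bounds")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if quality_at_rate(mid) >= target:
            hi = mid   # mid already meets the target; try lower rates
        else:
            lo = mid   # mid falls short; need a higher rate
    return hi
```

This inverts the usual rate-based scheme: the user fixes quality and the search minimizes rate, rather than fixing rate and minimizing MSE.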

Posted in: Briefs, TSP, Information Sciences, Data acquisition and handling, Imaging and visualization
