Core Technical Capability Laboratory Management System

The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data; the software is hosted on a server that allows users remote access. Users log into the system with their KSC (or other domain) username and password. They are authenticated within that domain, and their CTCLMS user privileges are then authenticated within the system. Menu options are displayed according to each user’s privileges (roles). CTCLMS users are assigned roles such as Lab Member, Lab Manager, Natural Neighbor Integration Manager, Organizational Manager, CTC Program Manager, or Administrator. The assigned role determines the user’s capabilities within the system. Users navigate the menu to view, edit, modify, or delete laboratory and equipment data, generate financial and managerial reports, and perform other CTC lab-related functions and analyses.
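
The brief describes role-based display of menu options but not how that mapping is implemented. A minimal sketch, assuming a simple role-to-menu lookup (all names below are hypothetical, not CTCLMS identifiers), might look like:

```python
# Illustrative role-based menu filtering; role names and menu items are
# placeholders, not taken from CTCLMS itself.

ROLE_MENUS = {
    "Lab Member":          ["View Lab Data", "View Equipment"],
    "Lab Manager":         ["View Lab Data", "Edit Lab Data", "Edit Equipment",
                            "Generate Reports"],
    "CTC Program Manager": ["View Lab Data", "Generate Reports",
                            "Financial Summary"],
    "Administrator":       ["View Lab Data", "Edit Lab Data", "Edit Equipment",
                            "Generate Reports", "Manage Users"],
}

def menu_for(roles):
    """Return the menu options visible to a user holding the given roles."""
    options = []
    for role in roles:
        for item in ROLE_MENUS.get(role, []):
            if item not in options:
                options.append(item)
    return options

print(menu_for(["Lab Member", "Lab Manager"]))
```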

Posted in: Briefs, TSP, Information Sciences, Architecture, Computer software and hardware, Internet, Data management, Logistics

MRO SOW Daily Script

The MRO SOW daily script (wherein “MRO” signifies “Mars Reconnaissance Orbiter” and “SOW” signifies “sequence systems engineer of the week”) is a computer program that automates portions of the MRO daily SOW procedure, which includes checking file-system sizes and automated sequence processor (ASP) log files. The MRO SOW daily script effects clear reporting of (1) the status of, and requirements imposed on, the file system and (2) the ASP log files.
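
The brief names two kinds of automated checks: file-system sizes and ASP log files. A generic sketch of such checks is shown below; the paths, threshold, and log-error pattern are assumptions for illustration, not details of the actual MRO SOW script.

```python
# Generic sketch of a daily check script (paths, threshold, and log pattern
# are hypothetical; the actual MRO SOW script is not reproduced here).
import re
import shutil

def check_filesystem(path, max_used_fraction=0.85):
    """Report whether a file system is over a usage threshold."""
    usage = shutil.disk_usage(path)
    used = 1.0 - usage.free / usage.total
    status = "OK" if used <= max_used_fraction else "OVER THRESHOLD"
    return f"{path}: {used:.0%} used ({status})"

def check_log(log_path, pattern=r"ERROR|FATAL"):
    """Report lines in a log file that match an error pattern."""
    hits = []
    with open(log_path) as f:
        for lineno, line in enumerate(f, 1):
            if re.search(pattern, line):
                hits.append(f"{log_path}:{lineno}: {line.rstrip()}")
    return hits or [f"{log_path}: no errors found"]

if __name__ == "__main__":
    print(check_filesystem("/"))
    for report_line in check_log("asp.log"):   # hypothetical log file name
        print(report_line)
```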

Posted in: Briefs, Information Sciences, Artificial intelligence, Computer software and hardware, Documentation

Object Recognition Using Feature- and Color-Based Methods

The combination of methods works better than does either method alone.

An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method involves a combination of two prior object-recognition methods — one based on adaptive detection of shape features and one based on adaptive color segmentation — to enable recognition in situations in which either prior method by itself may be inadequate.
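
The brief describes combining a shape-feature-based recognizer with a color-segmentation-based one so that either cue can compensate when the other is weak. The sketch below shows one generic way two detector confidences could be fused; the fusion rule, weights, and threshold are assumptions, not the published method.

```python
# Generic fusion of two recognizers' confidence scores (the weighted-sum
# rule and threshold are illustrative assumptions).

def fuse_scores(feature_score, color_score, w_feature=0.5, w_color=0.5):
    """Combine per-object confidences from a feature-based and a
    color-based recognizer into a single score in [0, 1]."""
    return w_feature * feature_score + w_color * color_score

def recognize(feature_score, color_score, threshold=0.6):
    """Declare a detection when the fused confidence clears a threshold,
    so strong shape evidence can offset weak color evidence (e.g. after a
    sudden illumination change), and vice versa."""
    return fuse_scores(feature_score, color_score) >= threshold

# Example: strong shape evidence offsets poor color evidence.
print(recognize(feature_score=0.9, color_score=0.4))  # True
```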

Posted in: Briefs, TSP, Information Sciences, Imaging and visualization, Neural networks

Root Source Analysis/ValuStream™ — a Methodology for Identifying and Managing Risks

Root sources of uncertainty are taken into account in a rigorous, systematic way.

Root Source Analysis (RoSA) is a systems-engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, because inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer’s mission-driven requirements. RoSA and ValuStream are synonymous terms.
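
As a purely illustrative sketch of the kind of bookkeeping such linking implies (the data structure and likelihood-times-consequence scoring are assumptions, not the published RoSA/ValuStream procedure), knowledge shortfalls can be tied to WBS elements and requirements and then ranked:

```python
# Hypothetical structure linking knowledge shortfalls to WBS elements and
# mission-driven requirements, then ranking them by a simple risk score.
from dataclasses import dataclass

@dataclass
class KnowledgeShortfall:
    wbs_element: str        # WBS/PBS element the shortfall belongs to
    requirement: str        # mission-driven requirement it threatens
    description: str
    likelihood: float       # 0..1, chance the assumption proves wrong
    consequence: float      # 0..1, impact on the requirement if it does

    @property
    def priority(self) -> float:
        return self.likelihood * self.consequence

shortfalls = [
    KnowledgeShortfall("1.2 Propulsion", "Delta-v margin",
                       "Unvalidated thruster performance model", 0.4, 0.9),
    KnowledgeShortfall("1.5 Thermal", "Survive eclipse",
                       "Heater sizing based on legacy data", 0.2, 0.6),
]

for s in sorted(shortfalls, key=lambda s: s.priority, reverse=True):
    print(f"{s.priority:.2f}  {s.wbs_element}: {s.description}")
```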

Posted in: Briefs, Information Sciences, Risk management

Ensemble: an Architecture for Mission-Operations Software

Several issues are addressed by capitalizing on the Eclipse open-source software framework.

“Ensemble” is the name of an open architecture for, and a methodology for the development of, spacecraft mission-operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software.

Posted in: Briefs, TSP, Information Sciences, Architecture, Computer software and hardware, Flight management systems, Spacecraft

Toward Better Modeling of Supercritical Turbulent Mixing

A study was done as part of an effort to develop computational models representing turbulent mixing under thermodynamic supercritical (here, high pressure) conditions. The question was whether the large-eddy simulation (LES) approach, developed previously for atmospheric-pressure compressible-perfect-gas and incompressible flows, can be extended to real-gas non-ideal (including supercritical) fluid mixtures. [In LES, the governing equations are approximated such that the flow field is spatially filtered and sub-grid-scale (SGS) phenomena are represented by models.] The study included analyses of results from direct numerical simulation (DNS) of several such mixing layers based on the Navier-Stokes, total-energy, and conservation-of-chemical-species governing equations.
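
For reference, the spatial filtering that LES applies to a flow variable and the sub-grid-scale stress that filtering leaves unclosed take the generic forms below. This is the standard LES formalism (written with a Favre, i.e. density-weighted, average as is usual for variable-density flows), not the specific models developed in the study.

```latex
% Standard LES spatial filter and Favre (density-weighted) average
\bar{\phi}(\mathbf{x}) = \int G(\mathbf{x}-\mathbf{x}')\,\phi(\mathbf{x}')\,d\mathbf{x}',
\qquad
\tilde{\phi} = \frac{\overline{\rho\phi}}{\bar{\rho}}

% Sub-grid-scale (SGS) stress left unclosed by filtering the momentum
% equation; this is the term that SGS models must represent
\tau_{ij}^{\mathrm{SGS}} = \bar{\rho}\left(\widetilde{u_i u_j} - \tilde{u}_i\,\tilde{u}_j\right)
```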

Posted in: Briefs, Information Sciences, Simulation and modeling, Thermodynamics, Gases

JPEG 2000 Encoding With Perceptual Distortion Control

The bit rate for a given level of perceptual distortion is minimized.

An alternative approach has been devised for encoding image data in compliance with JPEG 2000, the most recent still-image data-compression standard of the Joint Photographic Experts Group. Heretofore, JPEG 2000 encoding has been implemented by several related schemes classified as rate-based distortion-minimization encoding. In each of these schemes, the end user specifies a desired bit rate and the encoding algorithm strives to attain that rate while minimizing a mean squared error (MSE). While rate-based distortion minimization is appropriate for transmitting data over a limited-bandwidth channel, it is not the best approach for applications in which the perceptual quality of reconstructed images is a major consideration. A better approach for such applications is the present alternative one, denoted perceptual distortion control, in which the encoding algorithm strives to compress data to the lowest bit rate that yields at least a specified level of perceptual image quality.
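
The contrast between the two philosophies can be made concrete with a sketch: rate-based encoding minimizes MSE at a user-specified bit rate, whereas perceptual distortion control searches for the lowest rate whose reconstruction still meets a perceptual-quality target. The bisection strategy and the encode/quality callables below are assumptions for illustration, not the published algorithm or any JPEG 2000 library API.

```python
# Illustrative view of perceptual distortion control as a search over bit
# rate: find the lowest rate whose reconstruction still meets a quality
# target. encode() and perceptual_quality() are caller-supplied placeholders.

def lowest_acceptable_rate(image, encode, perceptual_quality,
                           quality_target, lo=0.05, hi=4.0, tol=0.01):
    """Bisect on bit rate (bits per pixel), assuming quality rises with rate."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        reconstructed = encode(image, bit_rate=mid)
        if perceptual_quality(image, reconstructed) >= quality_target:
            hi = mid          # target met: try a lower rate
        else:
            lo = mid          # target missed: need more bits
    return hi
```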

Posted in: Briefs, TSP, Information Sciences, Data acquisition and handling, Imaging and visualization

Intelligent Integrated Health Management for a System of Systems

Intelligent elements exchange information and each determines its own condition.

An intelligent integrated health management system (IIHMS) incorporates major improvements over prior such systems. The particular IIHMS is implemented for any system defined as a hierarchical distributed network of intelligent elements (HDNIE), comprising primarily: (1) an architecture (Figure 1), (2) intelligent elements, (3) a conceptual framework and taxonomy (Figure 2), and (4) an ontology that defines standards and protocols.
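
A minimal sketch of what an element in such a hierarchical distributed network might look like, assuming each element determines its own condition and aggregates its children's reports (the class, field names, and worst-case aggregation rule are hypothetical, not the IIHMS ontology):

```python
# Hypothetical hierarchical network of elements that each determine their
# own condition and roll up their children's conditions.
from dataclasses import dataclass, field

@dataclass
class IntelligentElement:
    name: str
    children: list = field(default_factory=list)
    local_condition: str = "nominal"     # set by the element's own diagnostics

    def assess(self) -> str:
        """An element reports the worst of its own condition and those of
        its children, so degradation propagates up the hierarchy."""
        order = {"nominal": 0, "degraded": 1, "failed": 2}
        worst = self.local_condition
        for child in self.children:
            c = child.assess()
            if order[c] > order[worst]:
                worst = c
        return worst

pump = IntelligentElement("pump", local_condition="degraded")
subsystem = IntelligentElement("propulsion", children=[pump])
print(subsystem.assess())   # "degraded"
```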

Posted in: Briefs, Information Sciences, Architecture, Communication protocols, Computer software and hardware, Systems management

Delay Banking for Managing Air Traffic

Delay credits could be expended to gain partial relief from flow restrictions.

Delay banking has been invented to enhance air-traffic management in a way that would increase the degree of fairness in assigning arrival, departure, and en-route delays and trajectory deviations to aircraft impacted by congestion in the national airspace system. In delay banking, an aircraft operator (airline, military, general aviation, etc.) would be assigned a numerical credit whenever one of its flights is delayed because of an air-traffic flow restriction. The operator could subsequently bid against other operators competing for access to congested airspace to utilize part or all of its accumulated credit. Operators could utilize credits to obtain higher priority for the same flight, or for other flights operating at the same time or later, in the same airspace or elsewhere. Operators could also trade delay credits, according to market rules that would be determined by stakeholders in the national airspace system.
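
A toy sketch of the bookkeeping implied by delay banking (credits accrued when a flow restriction delays a flight, then expended later); the one-credit-per-delay-minute rule and the class layout are assumptions, not the invention's actual accounting rules.

```python
# Toy delay-credit ledger; the accrual rule and structure are illustrative.
from collections import defaultdict

class DelayBank:
    def __init__(self):
        self.credits = defaultdict(float)   # operator -> accumulated credits

    def record_delay(self, operator, delay_minutes):
        """Credit an operator when a flow restriction delays one of its flights."""
        self.credits[operator] += delay_minutes

    def spend(self, operator, amount):
        """Expend credits, e.g. as a bid for priority in congested airspace."""
        if amount > self.credits[operator]:
            raise ValueError("insufficient delay credits")
        self.credits[operator] -= amount

bank = DelayBank()
bank.record_delay("Airline A", 25)   # 25-minute delay from a flow restriction
bank.spend("Airline A", 10)          # bid 10 credits for priority later
print(bank.credits["Airline A"])     # 15.0
```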

Posted in: Briefs, Information Sciences, Air traffic control

Spline-Based Smoothing of Airfoil Curvatures

Spurious curvature oscillations and bumps are suppressed.

Constrained fitting for airfoil curvature smoothing (CFACS) is a spline-based method of interpolating airfoil surface coordinates (and, concomitantly, airfoil thicknesses) between specified discrete design points so as to obtain smoothing of surface-curvature profiles in addition to basic smoothing of surfaces. CFACS was developed in recognition of the fact that the performance of a transonic airfoil is directly related to both the curvature profile and the smoothness of the airfoil surface.
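
As a rough illustration of the idea (a generic smoothing spline through discrete surface coordinates, with curvature evaluated from its derivatives), one might write the sketch below. This uses an ordinary SciPy smoothing spline and synthetic points, not the constrained CFACS formulation itself.

```python
# Generic smoothing-spline illustration of curvature smoothing (not the
# constrained CFACS fit; data points and smoothing factor are synthetic).
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical discrete design points along an airfoil-like upper surface.
x = np.linspace(0.0, 1.0, 25)
y = 0.12 * np.sqrt(x) * (1.0 - x) + 0.001 * np.random.randn(x.size)

spline = UnivariateSpline(x, y, k=5, s=1e-6)   # small smoothing factor

# Curvature of y(x): kappa = y'' / (1 + y'^2)^(3/2); smoothing the spline
# suppresses spurious oscillations and bumps in this profile.
dy = spline.derivative(1)(x)
d2y = spline.derivative(2)(x)
kappa = d2y / (1.0 + dy**2) ** 1.5
print(kappa[:5])
```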

Posted in: Briefs, TSP, Information Sciences, Wings, Aerodynamics
