Object Recognition Using Feature- and Color-Based Methods

The combination of methods works better than either method alone. An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method combines two prior object-recognition methods, one based on adaptive detection of shape features and one based on adaptive color segmentation, to enable recognition in situations in which either method by itself may be inadequate.
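The brief does not specify how the two detectors' outputs are combined; a minimal sketch of one plausible fusion rule, in which the color channel is down-weighted when illumination is unstable (since color segmentation degrades under sudden lighting changes), might look like the following. The function name, weighting scheme, and the illumination-stability input are all illustrative assumptions, not details from the brief.

```python
def fuse_detections(shape_conf, color_conf, illumination_stability):
    """Fuse shape-based and color-based confidences (each in [0, 1]).

    illumination_stability in [0, 1]: near 0 under sudden lighting
    changes, near 1 when lighting is steady. The color detector is
    trusted only to the extent that lighting is stable; the shape
    detector carries the remaining weight. (Illustrative rule only.)
    """
    w_color = 0.5 * illumination_stability
    w_shape = 1.0 - w_color
    return w_shape * shape_conf + w_color * color_conf
```

Under this rule a sudden illumination change (stability near 0) makes the fused score track the shape detector alone, which is the behavior the combined method is meant to provide.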

Posted in: Briefs, TSP, Information Sciences

Root Source Analysis/ValuStream™ — a Methodology for Identifying and Managing Risks

Root sources of uncertainty are taken into account in a rigorous, systematic way. Root Source Analysis (RoSA) is a systems-engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms.
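The core of the method, linking knowledge shortfalls to requirements and prioritizing them, can be sketched as a simple scoring step. The field names and the scoring formula below are hypothetical stand-ins for whatever criticality and maturity scales RoSA actually defines:

```python
def prioritize_shortfalls(shortfalls):
    """Rank knowledge shortfalls by risk to mission-driven requirements.

    Each shortfall is a dict with (illustrative) fields:
      requirement_criticality: importance of the threatened requirement, [0, 1]
      knowledge_maturity: how complete the current knowledge is, [0, 1]
    Risk is scored as criticality times the maturity gap; highest first.
    """
    def risk(s):
        return s["requirement_criticality"] * (1.0 - s["knowledge_maturity"])
    return sorted(shortfalls, key=risk, reverse=True)
```

A modestly critical requirement resting on immature knowledge can thereby outrank a highly critical requirement whose supporting knowledge is nearly complete, which is the kind of prioritization the linkage is meant to surface.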

Posted in: Briefs, Information Sciences

Ensemble: an Architecture for Mission-Operations Software

Several issues are addressed by capitalizing on the Eclipse open-source software framework. “Ensemble” is the name of an open architecture for, and a methodology for the development of, spacecraft mission-operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software.

Posted in: Briefs, TSP, Information Sciences

Toward Better Modeling of Supercritical Turbulent Mixing

A study was done as part of an effort to develop computational models representing turbulent mixing under thermodynamic supercritical (here, high pressure) conditions. The question was whether the large-eddy simulation (LES) approach, developed previously for atmospheric-pressure compressible-perfect-gas and incompressible flows, can be extended to real-gas non-ideal (including supercritical) fluid mixtures. [In LES, the governing equations are approximated such that the flow field is spatially filtered and sub-grid-scale (SGS) phenomena are represented by models.] The study included analyses of results from direct numerical simulation (DNS) of several such mixing layers based on the Navier-Stokes, total-energy, and conservation-of-chemical-species governing equations.
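The bracketed remark about spatial filtering refers to the standard LES formalism; in that standard formulation (not reproduced from the brief itself), a resolved field is obtained by convolving with a filter kernel G, and the unclosed SGS stress that the models must represent arises from the nonlinearity:

```latex
\bar{u}_i(\mathbf{x}) = \int G(\mathbf{x} - \mathbf{x}')\, u_i(\mathbf{x}')\, d\mathbf{x}',
\qquad
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i\, \bar{u}_j .
```

The DNS results mentioned in the brief serve as the reference data against which candidate models for terms such as the SGS stress can be assessed under supercritical conditions.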

Posted in: Briefs, Information Sciences

JPEG 2000 Encoding With Perceptual Distortion Control

The bit rate for a given level of perceptual distortion is minimized. An alternative approach has been devised for encoding image data in compliance with JPEG 2000, the most recent still-image data-compression standard of the Joint Photographic Experts Group. Heretofore, JPEG 2000 encoding has been implemented by several related schemes classified as rate-based distortion-minimization encoding. In each of these schemes, the end user specifies a desired bit rate and the encoding algorithm strives to attain that rate while minimizing a mean squared error (MSE). While rate-based distortion minimization is appropriate for transmitting data over a limited-bandwidth channel, it is not the best approach for applications in which the perceptual quality of reconstructed images is a major consideration. A better approach for such applications is the present alternative one, denoted perceptual distortion control, in which the encoding algorithm strives to compress data to the lowest bit rate that yields at least a specified level of perceptual image quality.
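The inversion of the usual problem, finding the lowest rate that meets a quality floor rather than the best quality at a fixed rate, can be illustrated with a bisection search, assuming quality is nondecreasing in rate. The search strategy and the caller-supplied quality callback are illustrative assumptions; the brief does not state how the actual encoder performs this search.

```python
def min_rate_for_quality(quality_at_rate, target_quality,
                         lo=0.01, hi=8.0, tol=1e-3):
    """Find the lowest bit rate (bits/pixel) meeting target_quality.

    quality_at_rate: callable mapping rate -> perceptual quality score,
        assumed monotonically nondecreasing (a stand-in for whatever
        perceptual metric the encoder uses).
    Returns hi if even the maximum rate cannot reach the target.
    """
    if quality_at_rate(hi) < target_quality:
        return hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if quality_at_rate(mid) >= target_quality:
            hi = mid  # target met; try a lower rate
        else:
            lo = mid  # target missed; need more bits
    return hi
```

For example, with a toy metric that saturates at 4 bits/pixel, `quality_at_rate = lambda r: min(1.0, r / 4.0)`, a target of 0.5 yields a rate near 2.0 bits/pixel.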

Posted in: Briefs, TSP, Information Sciences

Intelligent Integrated Health Management for a System of Systems

Intelligent elements exchange information and each determines its own condition. An intelligent integrated health management system (IIHMS) incorporates major improvements over prior such systems. An IIHMS can be implemented for any system defined as a hierarchical distributed network of intelligent elements (HDNIE), comprising primarily: (1) an architecture (Figure 1), (2) intelligent elements, (3) a conceptual framework and taxonomy (Figure 2), and (4) an ontology that defines standards and protocols.
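The key idea, each element determining its own condition within a hierarchy, can be sketched as a recursive health roll-up. The class below is a hypothetical illustration of one HDNIE node, not the brief's actual interface:

```python
class IntelligentElement:
    """One node in a hierarchical distributed network of intelligent
    elements (HDNIE): it assesses its own health and aggregates the
    conditions reported by its children. (Illustrative sketch only.)
    """

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.local_ok = True  # would be set by the element's own diagnostics

    def condition(self):
        """Healthy only if this element and all descendants are healthy."""
        return self.local_ok and all(c.condition() for c in self.children)
```

A fault detected locally by any leaf element thus propagates upward through the hierarchy without a central monitor having to poll every component directly.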

Posted in: Briefs, Information Sciences

Delay Banking for Managing Air Traffic

Delay credits could be expended to gain partial relief from flow restrictions. Delay banking has been invented to enhance air-traffic management in a way that would increase the degree of fairness in assigning arrival, departure, and en-route delays and trajectory deviations to aircraft impacted by congestion in the national airspace system. In delay banking, an aircraft operator (airline, military, general aviation, etc.) would be assigned a numerical credit when any of its flights is delayed because of an air-traffic flow restriction. The operator could subsequently bid against other operators competing for access to congested airspace to utilize part or all of its accumulated credit. Operators could utilize credits to obtain higher priority for the same flight, or for other flights operating at the same time or later, in the same airspace or elsewhere. Operators could also trade delay credits, according to market rules that would be determined by stakeholders in the national airspace system.
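The accrue-then-bid mechanism can be sketched as a simple ledger. The class, its methods, and the highest-affordable-bid-wins rule are illustrative assumptions; the brief leaves the actual market rules to airspace stakeholders.

```python
class DelayBank:
    """Hypothetical ledger for the delay-banking concept: operators
    accrue credits when flights are delayed by flow restrictions and
    spend them bidding for priority in congested airspace.
    """

    def __init__(self):
        self.credits = {}  # operator -> accumulated credit (delay minutes)

    def accrue(self, operator, delay_minutes):
        """Credit an operator for a delay imposed by a flow restriction."""
        self.credits[operator] = self.credits.get(operator, 0) + delay_minutes

    def award_priority(self, bids):
        """bids: {operator: amount}. The highest bid an operator can
        actually cover wins priority and is deducted; returns the winner,
        or None if no bid is affordable. (Illustrative market rule.)
        """
        valid = {op: amt for op, amt in bids.items()
                 if amt <= self.credits.get(op, 0)}
        if not valid:
            return None
        winner = max(valid, key=valid.get)
        self.credits[winner] -= valid[winner]
        return winner
```

Because credits are only spendable up to what an operator has accrued, an operator that has absorbed more delay gains proportionally more leverage in later congestion events, which is the fairness property the concept targets.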

Posted in: Briefs, Information Sciences