Information Technology & Software

IIR Filters for Postprocessing Noisy Test Data

A little-known parametric form of an infinite impulse response (IIR) filter has been found to be useful in digital postprocessing of noisy test signals. The filter equation is:

y(n) = α[x(n) + x(n – 1)] + γy(n – 1).
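The filter is a first-order recursion, so it can be sketched in a few lines. In the code below, `a` and `g` stand for α and γ; the particular choice a = (1 − g)/2, which gives unity gain at DC, and the values used are illustrative assumptions, not values from the brief.

```python
# Minimal sketch of the IIR filter y(n) = a*[x(n) + x(n-1)] + g*y(n-1)
# applied to a test signal, with zero initial conditions assumed.

def iir_filter(x, a, g):
    """Apply y(n) = a*[x(n) + x(n-1)] + g*y(n-1) to the sequence x."""
    y = []
    x_prev = 0.0  # x(n-1), assumed zero before the data start
    y_prev = 0.0  # y(n-1), assumed zero before the data start
    for xn in x:
        yn = a * (xn + x_prev) + g * y_prev
        y.append(yn)
        x_prev, y_prev = xn, yn
    return y

# Example: with a = (1 - g)/2 the DC gain 2a/(1 - g) equals 1, so a constant
# input of 1.0 is passed through while high-frequency noise would be attenuated.
g = 0.9
a = (1.0 - g) / 2.0
out = iir_filter([1.0] * 50, a, g)  # settles toward 1.0
```

The time constant of the smoothing is set by γ: values closer to 1 give heavier smoothing at the cost of a slower response.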

Posted in: Briefs, TSP, Information Sciences, Data acquisition and handling

The Complexity of the Diagnosis Problem

A report presents a study of the complexity of an algorithm that performs model-based diagnosis of a complex hardware system. [In model-based diagnosis, an algorithm detects logical inconsistencies between observational data and a description (mathematical model) of the system.] In the study, the problem of detecting logical inconsistencies is transformed into the problem of finding prime implicants of a monotone Boolean function. This transformation enables utilization of the well-developed machinery of Boolean-function theory, not directly accessible in the logical approach: one can work with monotone Boolean functions described by polynomial-size monotone circuits instead of attempting to deal with logical objects and performing exhaustive searches in order to extract all desired information. One especially notable result achieved in this study through the Boolean-function approach is the first analytical proof that the diagnosis problem is NP-complete. The report asserts that the discovery of the connection between diagnosis and Boolean functions may afford new means to solve the diagnosis problem, in particular, diagnostic algorithms that take super-polynomial but less-than-exponential amounts of time, in contrast to the exponential amounts of time heretofore needed to solve NP-complete problems.
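The key fact behind the reduction can be illustrated on a toy example: for a monotone Boolean function, the prime implicants correspond exactly to the function's minimal true points, so they can be read off directly. The function f = x1·x2 + x3 below is an illustrative choice, not one from the report, and brute force stands in for the circuit-based machinery.

```python
from itertools import product

# Toy monotone function: f(x1, x2, x3) = (x1 AND x2) OR x3.
def f(x1, x2, x3):
    return (x1 and x2) or x3

def minimal_true_points(fn, n):
    """Minimal true points of a monotone function = its prime implicants."""
    true_pts = [p for p in product([0, 1], repeat=n) if fn(*p)]
    def strictly_below(b, a):
        # b is componentwise <= a and b != a
        return a != b and all(bi <= ai for ai, bi in zip(a, b))
    return [p for p in true_pts
            if not any(strictly_below(q, p) for q in true_pts)]

print(minimal_true_points(f, 3))  # [(0, 0, 1), (1, 1, 0)]
```

The two minimal true points (0, 0, 1) and (1, 1, 0) encode the prime implicants x3 and x1·x2, exactly the terms of f.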

Posted in: Briefs, TSP, Information Sciences, Mathematical models, On-board diagnostics (OBD), Vehicle health management

Methodology for Tracking Hazards and Predicting Failures

This methodology can increase safety and reliability while reducing costs in all industries.

The Continuous Hazard Tracking and Failure Prediction Methodology (CHTFPM) is a proactive methodology for gathering and analyzing information about a system in order to prevent accidents and system failures. This proactivity places the CHTFPM in contrast to conventional formal inductive and deductive hazard-analysis methodologies, which are limited in their effectiveness because they do not provide real-time information on whether the conditions in a system are becoming hazardous and could lead to a system malfunction: the conventional methodologies basically provide feedback on hazards only after accidents have happened. The CHTFPM could be applied to advantage in almost all industries.

Posted in: Briefs, TSP, Information Sciences, Failure analysis, Prognostics, Hazards and emergency management, Hazards and emergency operations

Estimating Heterodyne-Interferometer Polarization Leakage

Correction for the nonlinearity contributed by polarization leakage can be made in real time.

A method of estimating and correcting for the effect of polarization leakage on the response of a heterodyne optical interferometer has been devised. In a typical application in which a heterodyne interferometer is used as a displacement or length gauge, the effect of the polarization leakage is a nonlinearity that typically gives rise to an error of the order of 1 nm in the displacement or length. By use of the present method, it should eventually be possible, in principle, to reduce the error to the order of 10 pm or less. The technique is primarily computational and does not require any additional interferometer hardware. Moreover, the computations can be performed on almost any modern computer in real time.

Posted in: Briefs, TSP, Information Sciences, Test equipment and instrumentation

An Efficient Algorithm for Propagation of Temporal-Constraint Networks

The computational cost is much less than in prior algorithms.

An efficient artificial-intelligence-type algorithm for the propagation of temporal constraints has been devised for incorporation into software that performs scheduling and planning of tasks in real time. This algorithm checks for temporal consistency and computes the time windows of time points within temporal-constraint networks, which are often used in scheduling and planning. A C++-language computer program that implements the algorithm has been written for incorporation into the control software of the Mission Data System of NASA's Jet Propulsion Laboratory. The algorithm and program could also be applied to industrial planning and scheduling problems.
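The brief does not detail the algorithm itself, so the sketch below uses the standard distance-graph formulation of a simple temporal network for illustration: an edge (u, v, w) encodes the constraint t_v − t_u ≤ w, all-pairs shortest paths yield the tightest implied bounds, and a negative cycle signals inconsistency. Floyd-Warshall is used here for brevity; it is a baseline, not the more efficient propagation scheme the brief describes.

```python
# Reference propagation for a simple temporal network (STN).
# Edge (u, v, w) means: time(v) - time(u) <= w.

INF = float("inf")

def propagate(n, edges):
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
    for k in range(n):            # Floyd-Warshall all-pairs shortest paths
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    consistent = all(d[i][i] >= 0 for i in range(n))  # no negative cycle
    return consistent, d

# Node 0 is a reference origin. Constraints: point 1 occurs 10..20 after the
# origin, and point 2 occurs 5..10 after point 1.
edges = [(0, 1, 20), (1, 0, -10), (1, 2, 10), (2, 1, -5)]
ok, d = propagate(3, edges)
# Implied time window of point 2: [-d[2][0], d[0][2]] = [15, 30]
```

Checking `ok` answers the consistency question, and the rows of `d` involving the origin give each time point's window.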

Posted in: Briefs, TSP, Information Sciences, Artificial intelligence, Computer software and hardware

Software for Continuous Replanning During Execution

Feedback from execution of a plan is used to update the plan continuously.

Continuous Activity Scheduling Planning Execution and Replanning (CASPER) is a computer program for automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. Now at the prototype stage of development, CASPER was conceived to enable a robotic exploratory spacecraft to perform onboard, autonomous planning and replanning of scientific observations and other functions in response to diverse unanticipated phenomena that could include unknown or changing environmental conditions, equipment failures, and errors in mathematical models used in planning. On Earth, CASPER could be adapted to use in scheduling operations of production lines and other complex systems.
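A continuous sense-plan-repair loop of the kind described above can be sketched in a few lines. Everything here, including names, the power-based conflict rule, and the repair move, is an illustrative assumption; CASPER's actual iterative-repair machinery is far richer.

```python
# Toy continuous-replanning loop: each cycle, execution feedback updates the
# state, conflicts are detected against that state, and the existing plan is
# repaired incrementally rather than regenerated from scratch.

def detect_conflicts(plan, power_available):
    # A "conflict" here: an activity demanding more power than is available.
    return [a for a in plan if a["power"] > power_available]

def repair(plan, conflicts, power_available):
    # Toy repair move: switch a conflicting activity to a mode that fits
    # within the currently available power.
    for a in conflicts:
        a["power"] = power_available
    return plan

def replanning_loop(plan, feedback_sequence):
    for power_available in feedback_sequence:   # execution feedback, per cycle
        conflicts = detect_conflicts(plan, power_available)
        if conflicts:
            plan = repair(plan, conflicts, power_available)
    return plan

plan = [{"name": "observe", "power": 5},
        {"name": "downlink", "power": 8}]
# Simulated feedback: available power degrades, then partially recovers.
final = replanning_loop(plan, [10, 6, 7])
```

The point of the structure is that a local repair preserves the rest of the plan, which is what makes replanning cheap enough to run continuously during execution.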

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Computer software and hardware

Compensating for Motion Errors in UWB SAR Data

Processing is implemented in two stages by a computationally efficient algorithm.

A method of processing data acquired by ultra-wide-band (UWB) synthetic-aperture radar (SAR) provides for suppression of those errors that are caused by the undesired relative motion of the radar platform and the targets. This method involves, among other things, processing of data in the wave-number or frequency domain and the application of motion compensation as a function of the position of a target relative to the radar platform.
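The frequency-domain idea can be illustrated with the simplest case, first-order (range-independent) motion compensation: a platform displacement dr adds a two-way phase error 2·k·dr to each frequency component of a pulse, which is removed by multiplying by the conjugate phase. This sketch is only the basic principle; the brief's two-stage, target-position-dependent algorithm is not reproduced here.

```python
import cmath
import math

C = 3.0e8  # speed of light, m/s

def motion_compensate(spectrum, freqs, dr):
    """Remove the phase error caused by a platform displacement dr (meters)."""
    out = []
    for s, f in zip(spectrum, freqs):
        k = 2 * math.pi * f / C                     # wavenumber at frequency f
        out.append(s * cmath.exp(1j * 2 * k * dr))  # cancels exp(-j*2*k*dr)
    return out

# Demonstration: inject the phase error for dr = 0.05 m, then remove it.
freqs = [0.5e9, 1.0e9, 1.5e9]            # UWB band samples, Hz (illustrative)
clean = [1 + 0j, 0.5 + 0j, 0.25 + 0j]
dr = 0.05
corrupted = [s * cmath.exp(-1j * 2 * (2 * math.pi * f / C) * dr)
             for s, f in zip(clean, freqs)]
recovered = motion_compensate(corrupted, freqs, dr)  # matches clean
```

Because the correction is a per-frequency complex multiply, it composes naturally with the rest of a frequency-domain SAR processing chain.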

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Radar, Data management

Maximum-Likelihood Template Matching

This algorithm features a robust measure of matching and an efficient search technique.

An improved algorithm for detecting gray-scale and binary templates in digitized images has been devised. The greatest difference between this algorithm and prior template-detecting algorithms lies in the measure used to determine the quality or degree of match between a template and a given portion of an image. This measure is based on a maximum-likelihood formulation of the template-matching problem; this measure, and the matching performance obtained by use of it, are more robust than those of prior template-matching algorithms, most of which utilize a sum-of-squared-differences measure. Other functions that the algorithm performs along with template matching include subpixel localization, estimation of uncertainty, and optimal selection of features. This algorithm is expected to be useful for detecting templates in digital images in a variety of applications, including recognition of objects, ranging by use of stereoscopic images, and tracking of moving objects or features. (For the purpose of tracking, features or objects recognized in an initial image could be used as templates for matching in subsequent images of the same scene.)

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Imaging, Imaging and visualization, Performance upgrades

Fast Algorithms and Circuits for Quantum Wavelet Transforms

These theoretical building blocks could be used to implement a variety of quantum algorithms.

Fast algorithms and the first complete and efficient circuits for implementing two quantum wavelet transforms have been developed in theory. The significance of this development within the overall development of quantum computing is the following: In principle, the algorithms and circuits constitute instructions for implementing the transforms by use of primitive quantum gates; the circuits in this case are analogous to circuit-diagram-level descriptions of classical electronic circuits that perform logic functions.

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Architecture, Integrated circuits

Software for Analyzing Root Causes of Process Anomalies

Root Cause Analysis (RoCA) is a computer program that assists analysts in understanding the root causes of process anomalies. As used here, “process anomalies” includes incidents that have caused, or that can potentially cause, injuries to personnel, damage to facilities, abnormal costs, or delays in processing. RoCA could be used, for example, in industry to investigate anomalies in production and by government agencies and airlines in investigating airplane accidents. Older software developed to aid such investigations offers limited capabilities for mapping the contribution of each root cause to a given process anomaly. Unlike the prior software, RoCA not only identifies root causes of process anomalies but also supports the identification of trends over multiple anomalies.

Posted in: Briefs, TSP, Information Sciences, Analysis methodologies, Computer software and hardware, Risk assessments
