Algorithm Optimally Orders Forward-Chaining Inference Rules

Requirements for exhaustive data-flow analysis are relaxed. Knowledge bases are typically developed in a somewhat ad hoc manner, with rules added incrementally and no particular organization. Because forward-chaining rules are often order sensitive, this practice can make execution of those rules very inefficient. The algorithm is relevant to tasks like Deep Space Network operations in that it allows a knowledge base to be developed incrementally and then ordered automatically for efficient execution.
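
A minimal sketch of the underlying idea, assuming each rule declares the facts it consumes and the facts it produces (the rule names, fact names, and representation below are hypothetical, not from the brief): ordering then reduces to a topological sort of the producer-to-consumer dependency graph.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical rule set: each rule consumes some facts and produces others.
rules = {
    "infer_orbit":  {"consumes": {"range", "doppler"}, "produces": {"orbit"}},
    "plan_contact": {"consumes": {"orbit"},            "produces": {"contact_plan"}},
    "check_link":   {"consumes": {"contact_plan"},     "produces": {"link_ok"}},
}

def order_rules(rules):
    """Order rules so every producer fires before the rules that consume it."""
    deps = {name: set() for name in rules}
    for a, ra in rules.items():
        for b, rb in rules.items():
            if a != b and ra["produces"] & rb["consumes"]:
                deps[b].add(a)  # b depends on a's output
    return list(TopologicalSorter(deps).static_order())

print(order_rules(rules))  # ['infer_orbit', 'plan_contact', 'check_link']
```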

Project Integration Architecture

All information involved in technological processes can be readily originated, manipulated, shared, propagated to other processes, and viewed by man or machine. The Project Integration Architecture (PIA) is a distributed, object-oriented, conceptual software framework for the generation, organization, publication, integration, and consumption of all information involved in any complex technological process, in a manner that is intelligible to both computers and humans. As used here, “all information” signifies, more specifically, all information that has been or could be coded in digital form. This includes not only experimental data, design data, results of simulations and analyses, organizational and financial data, and the like, but also sets of rules, computer programs, processes, and methods of solution.
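
The brief does not specify PIA's class hierarchy, but the core notion of a self-describing information object, consumable by both programs and people, can be sketched as follows (all names and fields here are illustrative assumptions):

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class InfoObject:
    """Illustrative self-describing information object (not PIA's actual API)."""
    name: str                      # human-readable identifier
    kind: str                      # e.g. "experimental-data", "rule-set"
    payload: Any                   # the digital content itself
    metadata: dict = field(default_factory=dict)  # machine-readable description

    def describe(self) -> str:
        """Render a human-readable view of the same information."""
        return f"{self.name} ({self.kind}): {self.metadata}"

wing_loads = InfoObject(
    name="wing-loads-run-42",
    kind="simulation-result",
    payload=[1.2, 1.5, 1.1],
    metadata={"units": "kN", "source": "FEM solver"},
)
print(wing_loads.describe())
```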

Reusable, Extensible High-Level Data-Distribution Concept

Users can optimize distributions for parallel computing without concern for tedious details. A framework for high-level specification of data distributions in data-parallel application programs has been conceived. [As used here, “distributions” signifies means of expressing locality (more specifically, the locations of specified pieces of data) in a computing system composed of many processor and memory components connected by a network.] Inasmuch as distributions strongly affect the performance of application programs, it is important that a distribution strategy be flexible, so that distributions can be adapted to the requirements of those programs. At the same time, for the sake of productivity in programming and execution, it is desirable that users be shielded from such error-prone, tedious details as those of communication and synchronization.
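
As a concrete illustration (not the framework's actual interface, which the brief does not give), a one-dimensional block distribution can be specified once at a high level, with the index-to-processor bookkeeping derived from it rather than hand-coded:

```python
class BlockDistribution:
    """Map global array indices to (processor, local index) pairs.

    Hypothetical class: sketches how a high-level distribution
    specification hides locality details from the programmer.
    """

    def __init__(self, n_elements: int, n_procs: int):
        self.n = n_elements
        self.p = n_procs
        self.block = -(-n_elements // n_procs)  # ceiling division

    def owner(self, i: int) -> int:
        """Processor that holds global index i."""
        return i // self.block

    def local_index(self, i: int) -> int:
        """Index of element i within its owner's local block."""
        return i % self.block

dist = BlockDistribution(n_elements=100, n_procs=4)
print(dist.owner(42), dist.local_index(42))  # processor 1, local index 17
```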

Monitoring by Use of Clusters of Sensor-Data Vectors

Incoming data vectors are compared with clustered vectors representative of normal operation. The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the “health” of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal (“healthy”) condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term “inductive” reflects the use of a process reminiscent of traditional mathematical induction to “learn” about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: its computational burden is significantly smaller, and it requires no abnormal-condition sensor data to be trained to recognize abnormal conditions.
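
A minimal sketch of the monitoring scheme, not NASA's implementation: cluster vectors of nominal sensor readings during training, then score incoming vectors by their distance to the nearest cluster (the data, cluster radius, and clustering rule below are synthetic assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake nominal training data: 500 two-sensor readings around (5.0, 100.0).
nominal = rng.normal(loc=[5.0, 100.0], scale=[0.1, 2.0], size=(500, 2))

def build_clusters(data, radius=3.0):
    """Greedy clustering of nominal vectors; each cluster keeps a running-mean centroid."""
    clusters = []
    for v in data:
        for c in clusters:
            if np.linalg.norm(v - c["centroid"]) < radius:
                c["count"] += 1
                c["centroid"] += (v - c["centroid"]) / c["count"]  # incremental mean
                break
        else:
            clusters.append({"centroid": v.copy(), "count": 1})
    return clusters

def deviation(v, clusters):
    """Distance from v to the nearest nominal cluster (near 0 = healthy)."""
    return min(np.linalg.norm(v - c["centroid"]) for c in clusters)

clusters = build_clusters(nominal)
print(deviation(np.array([5.0, 101.0]), clusters))  # small: nominal reading
print(deviation(np.array([9.0, 150.0]), clusters))  # large: anomalous reading
```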

Processing Satellite Imagery To Detect Waste Tire Piles

Less time is needed to search for previously unidentified piles. A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method’s development. The methodology combines commercially available image-processing and georeferencing software to develop a model that specifically distinguishes tire piles from other objects. The methodology reduces the time that must be spent initially surveying a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in the spectral characteristics associated with terrain and climate.
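
The brief does not publish the model itself; the following is only an illustrative sketch of per-pixel spectral-signature classification of the general kind described, with hypothetical band counts and thresholds:

```python
import numpy as np

def classify_tire_pixels(image, dark_max=0.08, flatness_max=0.02):
    """Flag pixels whose spectra are both dark and spectrally flat,
    a crude stand-in for the low, nearly uniform reflectance of rubber.

    image: array of shape (rows, cols, bands) with reflectance in [0, 1].
    Thresholds are hypothetical and would need regional tuning.
    """
    mean = image.mean(axis=2)    # overall brightness per pixel
    spread = image.std(axis=2)   # spectral flatness per pixel
    return (mean < dark_max) & (spread < flatness_max)

scene = np.random.rand(64, 64, 4) * 0.5   # fake 4-band scene
scene[10:14, 20:25] = 0.05                # plant a dark, flat "pile"
mask = classify_tire_pixels(scene)
print(mask.sum(), "candidate pixels")     # roughly the 20 planted pixels
```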

PPC750 Performance Monitor

The PPC750 Performance Monitor (Perfmon) is a computer program that helps the user assess the performance characteristics of application programs running under the Wind River VxWorks real-time operating system on a PPC750 computer. Perfmon provides a user-friendly interface and collects performance data by use of the performance registers provided by the PPC750 architecture. It processes and presents run-time statistics on a per-task basis over a repeating time interval (typically several seconds or minutes) specified by the user.
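
The per-task, per-interval bookkeeping can be sketched as follows; the sample data here is fabricated, since on real hardware the counter deltas would come from the PPC750 performance registers via the operating system (task and counter names are hypothetical):

```python
from collections import defaultdict

def summarize_interval(samples):
    """Aggregate counter deltas per task over one reporting interval.

    samples: iterable of (task_name, counter_name, delta) tuples.
    """
    totals = defaultdict(lambda: defaultdict(int))
    for task, counter, delta in samples:
        totals[task][counter] += delta
    return totals

# Fabricated samples standing in for register reads during one interval.
samples = [
    ("tTelem", "cycles", 120_000), ("tTelem", "icache_miss", 340),
    ("tNav",   "cycles",  80_000), ("tTelem", "cycles",  60_000),
]
for task, counters in summarize_interval(samples).items():
    print(task, dict(counters))
```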

Application-Program-Installer Builder

A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs on any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial installer software (e.g., InstallShield). It is intended to satisfy a quasi-industry-standard set of requirements for APIs that enable such installation and uninstallation while avoiding the pitfalls commonly encountered during installation of software.
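
A sketch of the install/uninstall pattern such APIs typically enforce to avoid one common pitfall, orphaned files left behind after removal (the real program targets Java; this sketch, with hypothetical names, only illustrates the manifest idea):

```python
import json
import shutil
from pathlib import Path

def install(src_dir: Path, dest_dir: Path) -> Path:
    """Copy an application into place and record a manifest of installed files."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    installed = []
    for f in src_dir.rglob("*"):
        if f.is_file():
            target = dest_dir / f.relative_to(src_dir)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
            installed.append(str(target))
    manifest = dest_dir / "install-manifest.json"
    manifest.write_text(json.dumps(installed))
    return manifest

def uninstall(manifest: Path) -> None:
    """Remove exactly the files named in the manifest, then the manifest itself."""
    for name in json.loads(manifest.read_text()):
        Path(name).unlink(missing_ok=True)  # Python 3.8+
    manifest.unlink()
```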
