# Expectation-Based Control of Noise and Chaos

Wednesday, 01 November 2006

A proposed approach to control of noise and chaos in dynamic systems would supplement conventional methods. The approach is based on fictitious forces composed of expectations governed by Fokker-Planck or Liouville equations that describe the evolution of the probability densities of the controlled parameters. These forces would be utilized as feedback control forces that would suppress the undesired diffusion of the controlled parameters. Examples of dynamic systems in which the approach is expected to prove beneficial include spacecraft, electronic systems, and coupled lasers.
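The control idea can be illustrated with a toy ensemble simulation (a sketch of my own, not the brief's Fokker-Planck formulation): a feedback force proportional to each trajectory's deviation from the ensemble mean, which here stands in for the expectation, suppresses the diffusion of a parameter driven by white noise.

```python
import random
import statistics

def simulate(n=2000, steps=400, dt=0.01, sigma=1.0, k=0.0, seed=1):
    """Euler-Maruyama ensemble simulation of dx = -k*(x - E[x]) dt + sigma dW.

    The term -k*(x - E[x]) is an illustrative stand-in for an
    expectation-based fictitious feedback force; E[x] is approximated
    by the ensemble mean at each step.
    """
    rng = random.Random(seed)
    xs = [0.0] * n
    for _ in range(steps):
        mean = sum(xs) / n  # expectation of the controlled parameter
        xs = [x + (-k * (x - mean)) * dt + sigma * dt ** 0.5 * rng.gauss(0, 1)
              for x in xs]
    return statistics.pvariance(xs)

free = simulate(k=0.0)    # uncontrolled: variance keeps growing with time
damped = simulate(k=5.0)  # feedback suppresses diffusion about the mean
print(f"variance without control: {free:.3f}, with control: {damped:.3f}")
```

Without feedback the variance grows roughly as sigma^2 * t; with feedback it settles near sigma^2 / (2k), which is the sense in which such a force suppresses the undesired diffusion of the controlled parameter.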

Read More >>

Posted in: Briefs, TSP

# Equations for Scoring Rules When Data Are Missing

Wednesday, 01 November 2006

A document presents equations for scoring rules in a diagnostic and/or prognostic artificial-intelligence software system of the rule-based inference-engine type. The equations define a set of metrics that characterize the evaluation of a rule when data required for the antecedence clause(s) of the rule are missing. The metrics include a primary measure denoted the rule completeness metric (RCM) plus a number of subsidiary measures that contribute to the RCM. The RCM is derived from an analysis of a rule with respect to its truth and a measure of the completeness of its input data. The derivation is such that the truth value of an antecedent is independent of the measure of its completeness. The RCM can be used to compare the degree of completeness of two or more rules with respect to a given set of data. Hence, the RCM can be used as a guide to choosing among rules during the rule-selection phase of operation of the artificial-intelligence system.
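A minimal sketch of the idea (my own simplified reading, not the equations in the cited document): completeness is scored from how many antecedent clauses have data available, while truth is evaluated only over the known clauses, so the two measures stay independent, as the abstract requires.

```python
def rule_completeness(clauses, data):
    """Illustrative rule-completeness metric (RCM) sketch.

    clauses: list of (key, predicate) pairs forming the rule's antecedent.
    data:    dict of available measurements; missing keys reduce
             completeness but do not affect the truth of known clauses.
    Returns (truth over known clauses, fraction of clauses with data).
    """
    known = [(k, p) for k, p in clauses if k in data]
    completeness = len(known) / len(clauses)   # fraction of clauses with data
    truth = all(p(data[k]) for k, p in known)  # truth from known clauses only
    return truth, completeness

clauses = [("temp", lambda v: v > 100.0),
           ("pressure", lambda v: v < 30.0),
           ("vibration", lambda v: v > 0.5)]

# 'vibration' is missing: the rule is true on the known data
# but only 2/3 complete.
print(rule_completeness(clauses, {"temp": 120.0, "pressure": 25.0}))
```

Ranking candidate rules by the completeness score would then give the kind of tie-breaking guidance during rule selection that the abstract describes.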

Read More >>

Posted in: Briefs, TSP

# Algorithm for Automated Detection of Edges of Clouds

Sunday, 01 October 2006

The algorithm has been shown to be reliable and robust.

An algorithm processes cloud-physics data gathered in situ by an aircraft, along with reflectivity data gathered by ground-based radar, to determine whether the aircraft is inside or outside a cloud at a given time. A cloud edge is deemed to be detected when the in/out state changes, subject to a hysteresis constraint. Such determinations are important in continuing research on relationships among lightning, electric charges in clouds, and decay of electric fields with distance from cloud edges.
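The hysteresis constraint can be sketched as follows (the threshold values and the use of a single scalar series are my own illustration, not the brief's actual criteria): the in/out state only flips when the measurement crosses one of two separated thresholds, so noise near a single threshold cannot generate spurious edges.

```python
def detect_edges(lwc, enter=0.05, exit=0.01):
    """Flag cloud-edge crossings in a time series using a hysteresis band.

    lwc:   in-situ cloud-physics values (e.g. liquid water content) per
           sample; the thresholds here are made-up illustrations.
    enter: value that must be exceeded to switch 'outside' -> 'inside'.
    exit:  value that must be undercut to switch 'inside' -> 'outside'.
    Returns a list of (sample_index, new_state) edge events.
    """
    inside = False
    edges = []
    for i, v in enumerate(lwc):
        if not inside and v > enter:
            inside = True
            edges.append((i, "inside"))
        elif inside and v < exit:
            inside = False
            edges.append((i, "outside"))
        # Values between exit and enter keep the current state (hysteresis),
        # so small fluctuations near one threshold do not create spurious
        # edge detections.
    return edges

samples = [0.0, 0.02, 0.08, 0.04, 0.03, 0.06, 0.005, 0.0]
print(detect_edges(samples))  # the dips to 0.04 and 0.03 do not count as exits
```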

Read More >>

Posted in: Briefs, TSP

# Exploiting Quantum Resonance to Solve Combinatorial Problems

Sunday, 01 October 2006

Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.

Read More >>

Posted in: Briefs, TSP

# A Concept for Run-Time Support of the Chapel Language

Sunday, 01 October 2006

A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages in both computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called “A Semantic Framework for Domains and Distributions in Chapel” and linked to a language specification defined in another separate document called “Chapel Specification 0.3.” The concept presented in the instant report is the recognition that a data domain invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.
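To make the distribution/domain idea concrete, here is a Python sketch of a Chapel-style block distribution of a one-dimensional domain over a set of locales (compute nodes). The function name and shape are my own illustration, not the run-time interface the document proposes.

```python
def block_map(domain_size, num_locales):
    """Map each index of a 1-D domain to an owning locale, in the style of
    a Chapel block distribution (simplified sketch; contiguous blocks of
    near-equal size, one block per locale).
    """
    per = -(-domain_size // num_locales)  # ceiling division: block size
    return {i: i // per for i in range(domain_size)}

# 10 indices over 4 locales: blocks of 3, 3, 3, 1
mapping = block_map(10, 4)
print([mapping[i] for i in range(10)])
```

In Chapel itself, the point of separating the domain from its distribution is that a parallel loop over the domain can execute each index on the locale that owns it, without the loop body mentioning the mapping at all.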

Read More >>

Posted in: Briefs, TSP

# Solving the Swath Segment Selection Problem

Friday, 01 September 2006

Several techniques for solving the problem have been tested and compared.

Several artificial-intelligence search techniques have been tested as means of solving the swath segment selection problem (SSSP), a real-world problem that is not only of interest in its own right, but is also useful as a test bed for search techniques in general. In simplest terms, the SSSP is the problem of scheduling the observation times of an airborne or spaceborne synthetic aperture radar (SAR) system to effect the maximum coverage of a specified area (denoted the target), given a schedule of downlinks (opportunities for radio transmission of SAR scan data to a ground station), given the limit on the quantity of SAR scan data that can be stored in an onboard memory between downlink opportunities, and given the limit on the achievable downlink data rate. The SSSP is NP-complete (short for “nondeterministic polynomial time complete,” characteristic of a class of intractable problems that can be solved only by use of computers capable of making guesses and then checking the guesses in polynomial time).
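A stripped-down, single-downlink version of the problem can be sketched as a greedy baseline (a toy of my own, not one of the search techniques the brief tested): choose swath segments that maximize coverage without exceeding onboard memory before the next downlink. Because the full SSSP chains many downlink windows and is NP-complete, a greedy pass like this is only a heuristic starting point.

```python
def greedy_swath_selection(segments, memory_capacity):
    """Greedy baseline for a toy single-downlink SSSP.

    segments:        list of (name, coverage, data_volume) tuples
                     (values are illustrative).
    memory_capacity: onboard storage available before the next downlink.
    Returns the names of the chosen segments.
    """
    # Favor segments with the best coverage per unit of stored data.
    ranked = sorted(segments, key=lambda s: s[1] / s[2], reverse=True)
    chosen, used = [], 0
    for name, coverage, volume in ranked:
        if used + volume <= memory_capacity:
            chosen.append(name)
            used += volume
    return chosen

segments = [("A", 40, 5), ("B", 30, 4), ("C", 35, 3), ("D", 10, 2)]
print(greedy_swath_selection(segments, memory_capacity=8))
```

The memory bound makes even this reduced version a knapsack-style selection, which is one intuition for why the full multi-downlink scheduling problem is intractable.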

Read More >>

Posted in: Briefs, TSP

# The Spatial Standard Observer

Friday, 01 September 2006

Degrees of visibility and discriminability of targets in images can be estimated.

The spatial standard observer is a computational model that provides a measure of the visibility of a target in a uniform background image or of the visual discriminability of two images. Standard observers have long been used in science and industry to quantify the discriminability of colors. Color standard observers address the spectral characteristics of visual stimuli, while the spatial standard observer (SSO), as its name indicates, addresses spatial characteristics. The SSO is based on a model of human vision. The SSO was developed in a process that included evaluation of a number of earlier mathematical models that address optical, physiological, and psychophysical aspects of spatial characteristics of human visual perception. Elements of the prior models are incorporated into the SSO, which is formulated as a compromise between accuracy and simplicity. The SSO operates on a digitized monochrome still image or on a pair of such images. The SSO consists of three submodels that operate sequentially on the input image(s):
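A toy pipeline in the same spirit might look like the following. This is a plausible reading of an image-pair visibility model, not the SSO's actual submodels: (1) form a contrast difference image, (2) apply a crude center-surround filter as a stand-in for a contrast-sensitivity weighting, and (3) pool the filtered differences into a single visibility score with a Minkowski norm.

```python
def sso_sketch(img_a, img_b, beta=2.0):
    """Toy visibility score for a pair of monochrome images given as
    equal-sized lists of rows. All three stages are illustrative
    stand-ins, not the SSO's published submodels.
    """
    h, w = len(img_a), len(img_a[0])
    mean = sum(sum(row) for row in img_a) / (h * w) or 1.0
    # (1) contrast difference relative to the background luminance
    diff = [[(img_b[r][c] - img_a[r][c]) / mean for c in range(w)]
            for r in range(h)]
    # (2) center minus 4-neighbor average: a minimal band-pass stand-in
    def filt(r, c):
        nb = [diff[r2][c2]
              for r2, c2 in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
              if 0 <= r2 < h and 0 <= c2 < w]
        return diff[r][c] - (sum(nb) / len(nb) if nb else 0.0)
    # (3) Minkowski pooling of the filtered responses
    total = sum(abs(filt(r, c)) ** beta for r in range(h) for c in range(w))
    return total ** (1.0 / beta)

bg = [[10.0] * 5 for _ in range(5)]
tgt = [row[:] for row in bg]
tgt[2][2] += 3.0  # small bright target on a uniform field
print(f"visibility score: {sso_sketch(bg, tgt):.3f}")
```

Identical images score zero, and the score grows with target contrast, which is the qualitative behavior a visibility metric of this kind must have.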

Read More >>

Posted in: Briefs
