Algorithm for Automated Detection of Edges of Clouds

The algorithm has been shown to be reliable and robust. An algorithm processes cloud-physics data gathered in situ by an aircraft, along with reflectivity data gathered by ground-based radar, to determine whether the aircraft is inside or outside a cloud at a given time. A cloud edge is deemed to be detected when the in/out state changes, subject to a hysteresis constraint. Such determinations are important in continuing research on relationships among lightning, electric charges in clouds, and decay of electric fields with distance from cloud edges.
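
The in/out decision with hysteresis can be sketched as a small state machine. The following Python sketch is illustrative only: the droplet-concentration observable and the two threshold values are assumptions, not the quantities or values used by the actual algorithm.

```python
# Hedged sketch: hysteresis-based in/out-of-cloud state machine.
# The observable (e.g., droplet concentration) and thresholds are
# illustrative assumptions, not those of the actual algorithm.

def detect_edges(samples, enter_thresh=10.0, exit_thresh=5.0):
    """Return indices where the in/out-of-cloud state changes.

    Hysteresis: enter the 'in cloud' state only above enter_thresh,
    and leave it only below exit_thresh, so that noise near a single
    threshold cannot register as a burst of spurious edges.
    """
    in_cloud = False
    edges = []
    for i, x in enumerate(samples):
        if not in_cloud and x > enter_thresh:
            in_cloud = True
            edges.append(i)   # cloud-entry edge
        elif in_cloud and x < exit_thresh:
            in_cloud = False
            edges.append(i)   # cloud-exit edge
    return edges
```

Separating the entry and exit thresholds is what implements the hysteresis constraint mentioned above: a reading that dips briefly between the two thresholds does not change the in/out state.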

Posted in: Briefs, TSP

Exploiting Quantum Resonance to Solve Combinatorial Problems

Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.

Posted in: Briefs, TSP

A Concept for Run-Time Support of the Chapel Language

A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages in both computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called “A Semantic Framework for Domains and Distributions in Chapel” and linked to a language specification defined in another separate document called “Chapel Specification 0.3.” The concept presented in the instant report is recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.
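
As a rough illustration of the distribution idea (not Chapel's actual run-time interface), a one-dimensional data domain can be block-mapped onto compute locales. The function name and the block policy below are assumptions made for the sketch.

```python
# Hedged sketch of block-distributing a 1-D index domain across
# locales, in the spirit of Chapel's distributed domains. This is
# an illustration, not Chapel's run-time API.

def block_distribute(domain_size, num_locales):
    """Map each index of a 1-D domain to a locale in contiguous
    blocks, spreading any remainder over the leading locales."""
    base, extra = divmod(domain_size, num_locales)
    mapping = []
    for loc in range(num_locales):
        count = base + (1 if loc < extra else 0)
        mapping.extend([loc] * count)
    return mapping
```

In a real run-time system, such a mapping determines which locale owns (allocates and accesses) each element of a distributed array declared over the domain.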

Posted in: Briefs, TSP

Solving the Swath Segment Selection Problem

Several techniques for solving the problem have been tested and compared. Several artificial-intelligence search techniques have been tested as means of solving the swath segment selection problem (SSSP), a real-world problem that is not only of interest in its own right, but is also useful as a test bed for search techniques in general. In simplest terms, the SSSP is the problem of scheduling the observation times of an airborne or spaceborne synthetic aperture radar (SAR) system to effect the maximum coverage of a specified area (denoted the target), given a schedule of downlinks (opportunities for radio transmission of SAR scan data to a ground station), given the limit on the quantity of SAR scan data that can be stored in an onboard memory between downlink opportunities, and given the limit on the achievable downlink data rate. The SSSP is NP-complete (short for “nondeterministic polynomial time complete,” characteristic of a class of intractable problems that can be solved only by use of computers capable of making guesses and then checking the guesses in polynomial time).
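
For illustration only, a simplified SSSP instance reduces to choosing swath segments under an onboard-storage budget. The greedy heuristic below is a baseline sketch under that simplification; it is not one of the search techniques that were actually tested.

```python
# Hedged sketch: greedy baseline for a simplified SSSP in which each
# candidate segment has a coverage value and a data size, and the
# only constraint is onboard memory capacity between downlinks.
# The exact SSSP is NP-complete; a greedy pass gives no optimality
# guarantee and serves only to illustrate the problem's structure.

def greedy_swath_selection(segments, capacity):
    """segments: list of (coverage, data_size) tuples.
    Returns (chosen segment indices, total coverage)."""
    chosen, used, coverage = [], 0, 0
    # Favor segments with the best coverage per unit of stored data.
    ranked = sorted(enumerate(segments),
                    key=lambda t: t[1][0] / t[1][1],
                    reverse=True)
    for i, (cov, size) in ranked:
        if used + size <= capacity:
            chosen.append(i)
            used += size
            coverage += cov
    return chosen, coverage
```

Complete search techniques (e.g., systematic backtracking or branch-and-bound) would instead explore the space of segment subsets, which is where the NP-completeness of the full problem bites.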

Posted in: Briefs, TSP

The Spatial Standard Observer

Degrees of visibility and discriminability of targets in images can be estimated. The spatial standard observer is a computational model that provides a measure of the visibility of a target in a uniform background image or of the visual discriminability of two images. Standard observers have long been used in science and industry to quantify the discriminability of colors. Color standard observers address the spectral characteristics of visual stimuli, while the spatial standard observer (SSO), as its name indicates, addresses spatial characteristics. The SSO is based on a model of human vision. The SSO was developed in a process that included evaluation of a number of earlier mathematical models that address optical, physiological, and psychophysical aspects of spatial characteristics of human visual perception. Elements of the prior models are incorporated into the SSO, which is formulated as a compromise between accuracy and simplicity. The SSO operates on a digitized monochrome still image or on a pair of such images. The SSO consists of three submodels that operate sequentially on the input image(s):

Posted in: Briefs

Less-Complex Method of Classifying MPSK

Nearly optimal performance can be obtained with less computation. An alternative to an optimal method of automated classification of signals modulated with M-ary phase-shift-keying (M-ary PSK or MPSK) has been derived. The alternative method is approximate, but it offers nearly optimal performance and entails much less complexity, which translates to much less computation time.
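
One widely used low-complexity approach to estimating the PSK order, not necessarily the method derived in this brief, is the M-th power statistic of the received symbols: raising MPSK symbols to the M-th power collapses the constellation to a single point when M matches (or is a multiple of) the true order. The candidate set and the 0.5 threshold below are assumptions for the sketch.

```python
# Hedged sketch: M-th power classifier for PSK order estimation.
# |E[r^M]| is near 1 when M is a multiple of the true order and
# near 0 otherwise (for noiseless, unit-magnitude symbols).
import cmath

def classify_mpsk(symbols, candidates=(2, 4, 8)):
    """Return the smallest candidate order whose M-th power
    statistic exceeds half the largest statistic observed
    (a heuristic threshold chosen for this illustration)."""
    stats = {M: abs(sum(s**M for s in symbols)) / len(symbols)
             for M in candidates}
    peak = max(stats.values())
    for M in candidates:
        if stats[M] >= 0.5 * peak:
            return M
    return candidates[-1]
```

The computation is a single pass over the symbols per candidate order, which conveys the flavor of trading a small accuracy loss for a large reduction in computation relative to an exact likelihood-based classifier.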

Posted in: Briefs, TSP

Assistant for Analyzing Tropical Rain Mapping Radar Data

A document describes an approach for a Tropical Rain Mapping Radar Data System (TDS). The TDS would comprise software and hardware elements incorporating a two-frequency spaceborne radar system for measuring tropical precipitation. The TDS would be used primarily in generating data products for scientific investigations. The most novel part of the TDS would be expert-system software to aid in the selection of algorithms for converting raw radar-return data into such primary observables as rain rate, path-integrated rain rate, and surface backscatter. The expert-system approach would address the issue that selection of algorithms for processing the data requires a significant amount of preprocessing, non-intuitive reasoning, and heuristic application, making it infeasible, in many cases, to select the proper algorithm in real time. In the TDS, tentative selections would be made to enable conversions in real time. The expert system would remove straightforwardly convertible data from further consideration, and would examine ambiguous data, performing analysis in depth to determine which algorithms to select. Conversions performed by these algorithms, presumed to be correct, would be compared with the corresponding real-time conversions. Incorrect real-time conversions would be updated using the correct conversions.
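
The triage idea, routing straightforward data to real-time conversion and ambiguous data to in-depth analysis, can be sketched as a toy rule base. The feature names (`snr`, `ambiguous`) and the routing labels below are invented for illustration; the actual TDS rules are not described here.

```python
# Hedged sketch: toy rule-based routing in the spirit of the TDS
# expert system. Feature names, thresholds, and routing labels are
# assumptions made for this illustration only.

def select_algorithm(obs):
    """Route one radar observation record (a dict of features)
    to a processing path."""
    if not obs.get("ambiguous", False) and obs.get("snr", 0.0) >= 10.0:
        # Straightforwardly convertible: handle in real time.
        return "realtime-standard"
    if obs.get("snr", 0.0) >= 3.0:
        # Ambiguous but usable: defer to in-depth expert analysis.
        return "deep-analysis"
    return "flag-for-review"
```

A production rule base would of course encode many more features and rules, but the control flow (cheap rules first, expensive analysis only for the residue) is the point of the expert-system approach described above.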

Posted in: Briefs, TSP
