Information Technology & Software

Integrated Risk and Knowledge Management Program — IRKM-P

The program couples risk management and knowledge management to help people do their work more effectively.

The NASA Exploration Systems Mission Directorate (ESMD) IRKM-P tightly couples risk management and knowledge management processes and tools to produce an effective “modern” work environment. IRKM-P objectives include: (1) to learn lessons from past and current programs (Apollo, Space Shuttle, and the International Space Station); (2) to generate and share new engineering design, operations, and management best practices through pre-existing Continuous Risk Management (CRM) procedures and knowledge-management practices; and (3) to infuse those lessons and best practices into current activities. The conceptual framework of the IRKM-P is based on the assumption that risks highlight potential knowledge gaps that might be closed through one or more knowledge-management practices or artifacts. These same risks also serve as cues for collection of knowledge — particularly, knowledge of technical or programmatic challenges that might recur.
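
A minimal sketch of that risk-to-knowledge coupling as a data structure may make the framework concrete: each risk record points to the knowledge gaps it exposes and to artifacts (lessons learned, best practices) that could close them. The field names and CRM-style 1-to-5 scoring below are illustrative assumptions, not the program's actual schema.

```python
# Illustrative sketch only: field names and scoring are assumptions,
# not the IRKM-P schema.
from dataclasses import dataclass, field

@dataclass
class Risk:
    title: str
    likelihood: int                    # CRM-style 1-5 score (assumed)
    consequence: int                   # CRM-style 1-5 score (assumed)
    knowledge_gaps: list = field(default_factory=list)
    artifacts: list = field(default_factory=list)  # lessons, best practices

    def score(self) -> int:
        # Simple likelihood x consequence ranking, used here for illustration.
        return self.likelihood * self.consequence

r = Risk("Valve qualification schedule", likelihood=4, consequence=3,
         knowledge_gaps=["cryogenic valve test history"],
         artifacts=["Shuttle-era lesson on valve qualification cycle times"])
print(r.title, "-> risk score", r.score(), "| artifacts:", r.artifacts)
```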

Posted in: Briefs, TSP, Information Sciences, Data management, Risk management, Technical review

Constructing LDPC Codes From Loop-Free Encoding Modules

High-speed iterative decoders can readily be implemented in hardware.

A method of constructing certain low-density parity-check (LDPC) codes by use of relatively simple loop-free coding modules has been developed. The subclasses of LDPC codes to which the method applies include accumulate-repeat-accumulate (ARA) codes, accumulate-repeat-check-accumulate codes, and the codes described in “Accumulate-Repeat-Accumulate-Accumulate Codes” (NPO-41305), NASA Tech Briefs, Vol. 31, No. 9 (September 2007), page 90. All of the affected codes can be characterized as serial/parallel (hybrid) concatenations of such relatively simple modules as accumulators, repetition codes, differentiators, and punctured single-parity-check codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. These codes can also be characterized as hybrid turbo-like codes that have projected graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms.
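
As a rough illustration of how such modules compose, the sketch below encodes with the simplest member of this family, a repeat-accumulate code: a repetition module feeding an accumulator. The repetition factor of 3 and the identity interleaver are assumptions chosen for brevity; the codes in the brief use designed interleavers and puncturing.

```python
import numpy as np

def repeat(bits, q=3):
    # Repetition module: copy each information bit q times.
    return np.repeat(bits, q)

def accumulate(bits):
    # Accumulator module: running XOR (mod-2 prefix sum) of the stream.
    return np.cumsum(bits) % 2

def ra_encode(info_bits, q=3):
    # Serial concatenation: repetition -> interleaver -> accumulator.
    # The interleaver is the identity here (an illustrative assumption).
    return accumulate(repeat(np.asarray(info_bits), q))

print(ra_encode([1, 0, 1, 1]))   # 12 coded bits for 4 info bits (rate 1/3)
```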

Posted in: Briefs, Information Sciences

LDPC Codes With Minimum Distance Proportional to Block Size

These codes offer both low decoding thresholds and low error floors.

Low-density parity-check (LDPC) codes characterized by minimum Hamming distances proportional to block sizes have been demonstrated. Like the codes mentioned in the immediately preceding article, the present codes are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.
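
For intuition, the minimum Hamming distance of a linear code is the smallest weight of a nonzero codeword, and for a short code it can be checked by brute force from the parity-check matrix. The 3×6 matrix below is a toy example, not one of the demonstrated codes, and the exhaustive search only scales to small block sizes.

```python
import itertools
import numpy as np

# Toy 3x6 parity-check matrix (an assumption for illustration only).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

n = H.shape[1]
d_min = min(
    sum(c)                                    # Hamming weight of codeword c
    for c in itertools.product([0, 1], repeat=n)
    if any(c) and not ((H @ c) % 2).any()     # nonzero c with H*c = 0 (mod 2)
)
print("minimum distance:", d_min)             # 3 for this toy matrix
```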

Posted in: Briefs, Information Sciences

Extending Newtonian Dynamics to Include Stochastic Processes

A paper presents further results of continuing research reported in several previous NASA Tech Briefs articles, the two most recent being “Stochastic Representations of Chaos Using Terminal Attractors” (NPO-41519), Vol. 30, No. 5 (May 2006), page 57, and “Physical Principle for Generation of Randomness” (NPO-43822), Vol. 33, No. 5 (May 2009), page 56. This research focuses upon a mathematical formalism for describing postinstability motions of a dynamical system characterized by exponential divergences of trajectories leading to chaos (including turbulence as a form of chaos).
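
One ingredient worth illustrating is the terminal attractor of the first cited brief: a non-Lipschitz equilibrium that trajectories reach in finite time rather than asymptotically. The sketch below integrates dx/dt = -x^(1/3); the step size and initial condition are arbitrary choices, and the analytic arrival time t* = (3/2)·x0^(2/3) follows by separation of variables.

```python
def time_to_equilibrium(x0=1.0, dt=1e-4, t_max=2.0):
    # Forward-Euler integration of dx/dt = -x**(1/3) until x reaches 0.
    # The non-Lipschitz right-hand side at x = 0 makes arrival finite-time.
    x, t = x0, 0.0
    while t < t_max and x > 0.0:
        x -= x ** (1.0 / 3.0) * dt
        t += dt
    return t

# Analytic arrival time is (3/2) * x0**(2/3) = 1.5 for x0 = 1.
print("reached equilibrium at t =", round(time_to_equilibrium(), 3))
```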

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Aerodynamics, Turbulence

Rate-Compatible LDPC Codes With Linear Minimum Distance

These protograph-based codes can have fixed input or output block sizes.

A recently developed method of constructing protograph-based low-density parity-check (LDPC) codes provides for low iterative decoding thresholds and minimum distances proportional to block sizes, and can be used for various code rates. A code constructed by this method can have either fixed input block size or fixed output block size and, in either case, provides rate compatibility.
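
For a sense of what rate compatibility means in practice, the sketch below shows the fixed-input-block-size case: one low-rate mother code is punctured (some coded bits withheld) to reach higher rates without changing the information block. The mother-code parameters are illustrative assumptions, not the families constructed in the brief.

```python
k, n_mother = 1024, 4096        # assumed info block and rate-1/4 mother code

for n_punct in (0, 1024, 2048, 2560):
    n = n_mother - n_punct      # transmitted block length after puncturing
    print(f"puncture {n_punct:4d} coded bits -> rate {k}/{n} = {k / n:.3f}")
```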

Posted in: Briefs, Information Sciences

Minimizing Input-to-Output Latency in Virtual Environments

A method and apparatus were developed to minimize latency (time delay) in virtual-environment (VE) and other discrete-time computer-based systems that require real-time display in response to sensor inputs. Latency in such systems is the sum of the finite times required for information processing and communication within and between sensors, software, and displays. Even though the latencies intrinsic to each individual hardware, software, and communication component can be minimized (or theoretically eliminated) by increasing internal computation and transmission speeds, time delays due to the integration of the overall system persist. These “integration” delays arise when data produced or processed by earlier components or stages in a system pathway sit idle, waiting to be accessed by subsequent components. Such idle times can be sizeable compared with the latencies of individual system components and can also vary in duration because of insufficient synchrony between events in the data path. This development is intended specifically to reduce the magnitude and variability of such idle-time delays and thus to minimize and stabilize overall latency in the complete VE (or other computer-based) system.
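
A toy model may help quantify the idle-time effect. Assuming, purely for illustration, a sensor sampled every 10 ms, 2 ms of processing, and a 60 Hz (16.7 ms) display: when the sensor free-runs, each frame shows the newest sample whose processing finished before the refresh, so its age includes idle time; triggering the sample "just in time" so processing completes at the refresh leaves only the processing delay.

```python
SENSOR_PERIOD, FRAME_PERIOD, PROC_TIME = 10.0, 16.7, 2.0   # ms (assumed)

def free_running_mean_age(n_frames=1000):
    # Average age of displayed data when sensor and display free-run.
    total = 0.0
    for i in range(1, n_frames + 1):
        refresh = i * FRAME_PERIOD
        k = int((refresh - PROC_TIME) // SENSOR_PERIOD)  # newest usable sample
        total += refresh - k * SENSOR_PERIOD             # its age at display
    return total / n_frames

print(f"free-running: {free_running_mean_age():.1f} ms mean data age")
print(f"just-in-time: {PROC_TIME:.1f} ms (processing delay only)")
```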

Posted in: Briefs, TSP, Information Sciences, Virtual reality, Computer software and hardware, Displays, Data management

PrimeSupplier Cross-Program Impact Analysis and Supplier Stability Indicator Simulation Model

This application has potential uses in supply-chain and enterprise-resource planning software.

PrimeSupplier, a supplier cross-program and element-impact simulation model with a supplier solvency indicator (SSI), has been developed so that the Space Shuttle Program can see early indicators of supplier and product-line stability while identifying the elements and/or programs that have a particular supplier or product designed into the system. The model calculates two categories of benchmarks to determine the SSI: one focuses on agency programmatic data and the other on a supplier’s financial liquidity.
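
The brief does not publish its benchmark formulas, but a hedged sketch of the blending idea might look as follows; every metric name, threshold, and weight here is an assumption invented for illustration.

```python
def liquidity_score(current_assets, current_liabilities):
    # Financial-liquidity benchmark: current ratio, with 2.0 treated as
    # fully healthy (an assumed threshold), clipped to [0, 1].
    return min(current_assets / current_liabilities / 2.0, 1.0)

def programmatic_score(active_programs, sole_source_parts, total_parts):
    # Programmatic benchmark: a supplier is safer when more programs share
    # it and fewer of its parts are sole-source (assumed heuristics).
    diversification = min(active_programs / 4.0, 1.0)
    redundancy = 1.0 - sole_source_parts / total_parts
    return 0.5 * (diversification + redundancy)

def ssi(financial, programmatic):
    return 0.5 * financial + 0.5 * programmatic   # assumed equal weighting

fin = liquidity_score(current_assets=3.2e6, current_liabilities=2.5e6)
prog = programmatic_score(active_programs=2, sole_source_parts=3, total_parts=10)
print(f"SSI = {ssi(fin, prog):.2f} (lower = less stable)")
```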

Posted in: Briefs, Information Sciences, Computer simulation, Financial management, Logistics, Supplier assessment

Integrated Planning for Telepresence With Time Delays

An artificial-intelligence assistant helps a human supervisor control a distant robot.

A conceptual “intelligent assistant” and an artificial-intelligence computer program that implements the intelligent assistant have been developed to improve control exerted by a human supervisor over a robot that is so distant that communication between the human and the robot involves significant signal-propagation delays. The goal of the effort is not only to help the human supervisor monitor and control the state of the robot, but also to improve the efficiency of the robot by allowing the supervisor to “work ahead.” The intelligent assistant is an integrated combination of an artificial-intelligence planner and a monitor of states of both the human supervisor and the remote robot. The novelty of the system lies in the way it uses the planner to reason about the states at both ends of the time delay.
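
The time-delay bookkeeping behind "working ahead" can be sketched simply: telemetry received now is already one one-way delay old, and a command issued now takes another one-way delay to arrive, so the planner must reason about the robot's state two one-way delays (plus any deliberation time) past the last report. The constant-velocity prediction and the numbers below are illustrative assumptions, not the planner's actual model.

```python
ONE_WAY_DELAY = 8.0   # seconds each way (assumed for illustration)

def predicted_state_at_arrival(reported_pos, velocity, t_since_receipt):
    # The report was ONE_WAY_DELAY old when received; a command issued now
    # needs ONE_WAY_DELAY more to arrive, so predict over the full horizon.
    horizon = 2 * ONE_WAY_DELAY + t_since_receipt
    return reported_pos + velocity * horizon    # constant-velocity model

# Where will the robot be when a command issued 1 s after telemetry lands?
print(predicted_state_at_arrival(reported_pos=10.0, velocity=0.2,
                                 t_since_receipt=1.0))   # 10 + 0.2*17 = 13.4
```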

Posted in: Briefs, Information Sciences, Artificial intelligence, Computer software and hardware, Human machine interface (HMI), Robotics

Gaussian and Lognormal Models of Hurricane Gust Factors

A document describes a tool that predicts the likelihood of land-falling tropical storms and hurricanes exceeding specified peak speeds, given the mean wind speed at various heights of up to 500 feet (150 meters) above ground level. Empirical models to calculate mean and standard deviation of the gust factor as a function of height and mean wind speed were developed in Excel based on data from previous hurricanes. Separate models were developed for Gaussian and offset lognormal distributions for the gust factor. Rather than forecasting a single, specific peak wind speed, this tool provides a probability of exceeding a specified value. This probability is provided as a function of height, allowing it to be applied at a height appropriate for tall structures.
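
Under the Gaussian variant, the exceedance probability reduces to a normal tail: the peak exceeds V_peak when the gust factor G exceeds V_peak/V_mean, so P = 1 - Phi((V_peak/V_mean - mu_G)/sigma_G), where mu_G and sigma_G come from the empirical height- and speed-dependent models. The numeric values below are placeholders, not the coefficients of the brief's Excel models.

```python
from math import erf, sqrt

def prob_peak_exceeds(v_peak, v_mean, mu_g=1.4, sigma_g=0.15):
    # mu_g and sigma_g stand in for the empirical gust-factor model at the
    # height of interest (placeholder values, not the brief's fits).
    z = (v_peak / v_mean - mu_g) / sigma_g
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))    # Gaussian tail 1 - Phi(z)

# Chance that a 60-kt mean wind gusts above 90 kt (gust factor 1.5):
print(f"{prob_peak_exceeds(90.0, 60.0):.3f}")
```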

Posted in: Briefs, TSP, Information Sciences, Mathematical models, Weather and climate

Aligning a Receiving Antenna Array To Reduce Interference

This arraying algorithm has potential utility in radio astronomy and radio communication.

A digital signal-processing algorithm has been devised as a means of aligning (as defined below) the outputs of multiple receiving radio antennas in a large array for the purpose of receiving a desired weak signal transmitted by a single distant source in the presence of an interfering signal that (1) originates at another source lying within the antenna beam and (2) occupies a frequency band significantly wider than that of the desired signal. In the original intended application of the algorithm, the desired weak signal is a spacecraft telemetry signal, the antennas are spacecraft-tracking antennas in NASA’s Deep Space Network, and the source of the wide-band interfering signal is typically a radio galaxy or a planet that lies along or near the line of sight to the spacecraft. The algorithm could also afford the ability to discriminate between desired narrow-band and nearby undesired wide-band sources in related applications that include satellite and terrestrial radio communications and radio astronomy.
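
The brief does not spell out the algorithm's internals, so the sketch below shows only one plausible alignment step under stated assumptions: each antenna's phase offset relative to a reference is estimated from a single DFT bin at the desired narrow-band frequency (a one-bin narrow-band filter), then counter-rotated before summing. The desired signal combines coherently, while the wide-band interferer, modeled here as white noise, gains much less from the sum. All signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, f_sig = 4096, 0.05                  # samples; desired tone frequency
t = np.arange(n)
true_offsets = [0.0, 0.9, -1.3]        # per-antenna phases (unknown in practice)

# Antenna outputs: weak narrow-band tone + strong wide-band interference.
outputs = [0.2 * np.exp(1j * (2 * np.pi * f_sig * t + p))
           + rng.normal(size=n) + 1j * rng.normal(size=n)
           for p in true_offsets]

# One DFT bin at the tone frequency serves as the narrow-band filter.
bins = [np.sum(x * np.exp(-2j * np.pi * f_sig * t)) for x in outputs]

# Counter-rotate each antenna to the reference phase, then combine.
est = [np.angle(b * np.conj(bins[0])) for b in bins]
combined = sum(x * np.exp(-1j * e) for x, e in zip(outputs, est))

print("estimated offsets:", [round(float(e), 2) for e in est])
```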

Posted in: Briefs, Information Sciences, Mathematical models, Antennas
