An Empirical Metric of Individual Datapoint Utility Given Ample Metadata as Applied to the OCO-2 Flight System

This method constructs new warn levels for metadata-rich data sources.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Traditionally, quality flags provided a binary yes/no estimate of a datapoint’s utility. Modern instrumentation, however, yields significant auxiliary information for each datapoint, permitting predictions richer than a binary judgment of good or bad data. Further, the physical confounding effects that obscure an observation’s utility are themselves rarely binary; clouds, for example, range in thickness from insignificant to entirely opaque. In this method, many increasingly stringent filters are created, each allowing less data through while attempting to minimize an error metric. The error metric is evaluated against select “truth” systems such as ground observations or regions of the Earth where the truth is believed to be predictable and known. For each sounding, the number of filters that reject the observation becomes an estimate of its data quality: larger values mean most filters reject the sounding, while smaller values mean most filters accept it. This integer, ranging from 0 to 19, is called the Warn Level. Instead of a binary yes/no quality flag, this provides a data-ordering paradigm with “better” and “worse” data. Warn Levels can be developed for any metadata-rich data source with a functional error metric, guiding researchers to superior, tunable data filtration.
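
The Python sketch below illustrates the counting idea behind the Warn Level; the metadata field, thresholds, and number of filters are hypothetical stand-ins, and the real filters are tuned against the truth proxies described above rather than chosen by hand.

# Minimal sketch of the Warn Level idea (not the OCO-2 production code).
# The "cloud_fraction" field and the thresholds are hypothetical.

def make_filters(n_filters=19):
    """Build increasingly stringent accept/reject filters on one metadata field."""
    # Thresholds tighten from permissive to strict, so each successive
    # filter lets less data through.
    thresholds = [1.0 - (i + 1) / (n_filters + 1) for i in range(n_filters)]
    return [lambda s, t=t: s["cloud_fraction"] <= t for t in thresholds]

def warn_level(sounding, filters):
    """Warn Level = number of filters that reject the sounding.
    0 means every filter accepts it; len(filters) means every filter rejects it."""
    return sum(0 if f(sounding) else 1 for f in filters)

filters = make_filters()
print(warn_level({"cloud_fraction": 0.02}, filters))  # low value: better data
print(warn_level({"cloud_fraction": 0.90}, filters))  # high value: worse data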

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Computer software and hardware, Data management

Gravitational Compensation Onboard a Comsat

NASA’s Jet Propulsion Laboratory, Pasadena, California

This new technique compensates for the gravitational attraction experienced by a test mass freely floating onboard a satellite, solving an important problem faced by all gravitational-wave missions. Applied to the geostationary Laser Interferometer Space Antenna (gLISA) mission concept, it completely resolves an important noise source: gravity-gradient noise.

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Satellites

AMMOS-PDS Pipeline Service (APPS) — Label Design Tool (LDT)

NASA’s Jet Propulsion Laboratory, Pasadena, California

A software program builds PDS4 science product labels (metadata) and automatically generates their descriptions as part of the software interface specification (SIS) document. The software allows the mission system engineer to interact programmatically with the PDS4 information model and to retrieve science product metadata via graphical user interfaces (GUIs). This capability will greatly improve the process of creating and generating software interface specification documents for science instruments. Because PDS4 is a newly defined standard, most of the work this software suite simplifies is currently done manually. The improvement supports the definition and design of PDS4 science data archive models for generating PDS4-compliant labels.
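
The schematic Python sketch below shows the flavor of building a product label programmatically; it is not the LDT itself, and the simplified element names and identifiers are illustrative rather than schema-valid PDS4.

# Schematic sketch of generating a simplified PDS4-style label.
import xml.etree.ElementTree as ET

def build_label(logical_id, version_id, title):
    """Assemble a minimal, illustrative product label as XML text."""
    root = ET.Element("Product_Observational")
    ident = ET.SubElement(root, "Identification_Area")
    ET.SubElement(ident, "logical_identifier").text = logical_id
    ET.SubElement(ident, "version_id").text = version_id
    ET.SubElement(ident, "title").text = title
    return ET.tostring(root, encoding="unicode")

print(build_label("urn:nasa:pds:example_bundle:data:product_001",
                  "1.0", "Example science product"))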

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Computer software and hardware, Data acquisition, Test equipment and instrumentation

Ontological System for Context Artifacts and Resources (OSCAR)

NASA’s Jet Propulsion Laboratory, Pasadena, California

Current data systems catalog and link data using a synthetic modeling approach that requires considerable domain knowledge to interact with the system, including which keywords to search for and how data artifacts are linked. OSCAR offers a semantic solution to data management based on ontology and reasoning. Information is automatically linked according to an internal ontology, and an internal ontological reasoning engine handles information inference. Artifacts are linked using information mined from the input metadata and reasoned over according to the internal ontology.
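
The toy Python sketch below illustrates the ontology-plus-reasoning idea with a single, assumed rule (a transitive "derivedFrom" relation); OSCAR's actual ontology and reasoning engine are far richer.

# Toy sketch: artifacts stored as subject-predicate-object triples, with a
# small forward-chaining reasoner that infers new links from the ontology.

triples = {
    ("calibrated_spectrum_042", "derivedFrom", "raw_frame_042"),
    ("level2_retrieval_042", "derivedFrom", "calibrated_spectrum_042"),
    ("raw_frame_042", "observedBy", "instrument_A"),
}

TRANSITIVE = {"derivedFrom"}  # part of the (toy) ontology

def infer(triples):
    """Apply the transitivity rule repeatedly until no new links appear."""
    closed = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(closed):
            for (s2, p2, o2) in list(closed):
                if p1 in TRANSITIVE and p1 == p2 and o1 == s2:
                    new = (s1, p1, o2)
                    if new not in closed:
                        closed.add(new)
                        changed = True
    return closed

for t in sorted(infer(triples) - triples):
    print("inferred:", t)  # links level2_retrieval_042 back to raw_frame_042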

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Simulation and modeling, Data management

SPSCGR

NASA’s Jet Propulsion Laboratory, Pasadena, California

SPSCGR generates a contact graph suitable for use by the ION (Interplanetary Overlay Network) implementation of DTN (Delay/Disruption Tolerant Networking) from data provided by the JPL SPS (Service Preparation System) Portal. Before SPSCGR, there was no way for a mission or other entity to route DTN traffic across the Deep Space Network (DSN) without manually constructing a contact graph; SPSCGR automates that construction.
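
The Python sketch below shows the kind of translation being automated: hypothetical DSN pass times are turned into ionadmin-style contact-plan entries. The node numbers, data rate, light time, and pass schedule are illustrative, and the real SPS Portal data and SPSCGR output are more detailed.

# Sketch: turn a simplified pass schedule into ION contact-plan commands.
passes = [
    # (start, stop, ground-station node, spacecraft node, bytes/sec)
    ("2017/03/01-02:00:00", "2017/03/01-03:00:00", 5, 12, 125000),
    ("2017/03/01-10:30:00", "2017/03/01-11:15:00", 5, 12, 125000),
]
ONE_WAY_LIGHT_TIME_SEC = 1200  # hypothetical range to the spacecraft

def contact_plan(passes, owlt):
    """Emit ionadmin commands describing when each pair of nodes can talk."""
    lines = []
    for start, stop, gs, sc, rate in passes:
        # contacts in both directions, plus the signal-propagation range
        lines.append(f"a contact {start} {stop} {gs} {sc} {rate}")
        lines.append(f"a contact {start} {stop} {sc} {gs} {rate}")
        lines.append(f"a range {start} {stop} {gs} {sc} {owlt}")
    return "\n".join(lines)

print(contact_plan(passes, ONE_WAY_LIGHT_TIME_SEC))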

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Computer software and hardware, Data management

Software Framework for Control and Observation in Distributed Environments (CODE)

Ames Research Center, Moffett Field, California

CODE is a framework for control and observation in distributed environments. The framework enables the observation of resources (computer systems, storage systems, networks, and so on), services (database servers, application execution servers, file transfer servers, and so on), and applications. Further, the framework supports the secure and scalable transmission of this observed information to programs that are interested in it, as well as the secure execution of actions on remote computer systems so that a management program can respond to the observed data it receives. To assist in writing management programs, the framework interfaces to an existing expert system so that a user can define a set of rules for the expert system to reason on instead of writing a large amount of code. The framework is modular and can easily be extended to incorporate new sensors to make observations, new actuators to perform actions, new communication protocols, and new security mechanisms. The software also includes several applications that show how the framework can be used.
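
The toy Python sketch below illustrates the observe/decide/act pattern the framework supports; the sensor, actuator, and rule shown are hypothetical, and the real framework adds secure, scalable transport and pluggable components.

# Toy sketch of a rule-driven management loop over sensors and actuators.
def disk_usage_sensor():
    """Sensor: return an observation about a monitored resource."""
    return {"resource": "storage01", "metric": "disk_used_pct", "value": 93}

def purge_scratch_actuator(observation):
    """Actuator: the action a management program would run remotely."""
    print(f"purging scratch space on {observation['resource']}")

# Rules stand in for the expert-system knowledge base: each rule is a
# (condition, action) pair instead of hand-written management code.
rules = [
    (lambda obs: obs["metric"] == "disk_used_pct" and obs["value"] > 90,
     purge_scratch_actuator),
]

def management_loop(sensors, rules):
    for sense in sensors:
        obs = sense()
        for condition, action in rules:
            if condition(obs):
                action(obs)

management_loop([disk_usage_sensor], rules)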

Posted in: Briefs, Electronics & Computers, Software, Communication protocols, Computer software and hardware

Simple RunTime eXecutive (SRTX)

Marshall Space Flight Center, Alabama

Simple RunTime eXecutive (SRTX) software provides scheduling and publish/subscribe data transfer services. The scheduler allows dynamic allocation of real-time periodic and asynchronous tasks across homogeneous multicore/multiprocessor systems. Most real-time systems assign tasks to specific cores on an a priori basis, and allowing the operating system scheduler to determine the best allocation of threads is not in itself a unique innovation. Here, however, it is coupled with a deterministic publish/subscribe data transfer system that guarantees the tasks process data deterministically, regardless of the number of processor cores in the system.
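
One simple way to make publish/subscribe deterministic is frame-based double buffering, sketched below in Python; this illustrates the property described above and is not necessarily SRTX's internal mechanism.

# Sketch: subscribers read only data published in the previous frame, so
# results do not depend on which core (or in what order) tasks run within a frame.
class Topic:
    def __init__(self, initial=None):
        self._front = initial   # what subscribers read this frame
        self._back = initial    # what publishers write this frame

    def publish(self, value):
        self._back = value

    def read(self):
        return self._front

    def swap(self):
        """Called by the scheduler at the frame boundary."""
        self._front = self._back

altitude = Topic(initial=0.0)

def sensor_task():          # publisher
    altitude.publish(1234.5)

def guidance_task():        # subscriber: always sees last frame's value
    print("guidance sees altitude =", altitude.read())

for frame in range(2):
    sensor_task()
    guidance_task()
    altitude.swap()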

Posted in: Briefs, Electronics & Computers, Software, Computer software and hardware, Data management

v-Anomica: A Fast Support Vector-Based Novelty Detection Technique

Ames Research Center, Moffett Field, California

Outlier or anomaly detection refers to the task of identifying abnormal or inconsistent patterns from a dataset. While they may seem to be undesirable entities, identifying them has many potential applications in fraud and intrusion detection, medical research, and safety-critical vehicle health management. Outliers can be detected using supervised, semi-supervised, or unsupervised techniques. Unsupervised techniques do not require labeled instances for detecting outliers. Supervised techniques require labeled instances of both normal and abnormal operation data for first building a model (e.g., a classifier), and then testing if an unknown data point is a normal one or an outlier. The model can be probabilistic such as Bayesian inference or deterministic such as decision trees, Support Vector Machines (SVMs), and neural networks. Semi-supervised techniques only require labeled instances of normal data. Hence, they are more widely applicable than the fully supervised ones. These techniques build models of normal data and then flag outliers that do not fit the model.
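
The Python sketch below shows the general semi-supervised workflow using scikit-learn's standard one-class SVM; v-Anomica itself is a faster support-vector-based variant, and this is not its implementation.

# Semi-supervised novelty detection: train on labeled normal data only,
# then flag test points that do not fit the learned model.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_train = rng.normal(loc=0.0, scale=1.0, size=(500, 2))   # normal data only
test = np.vstack([rng.normal(0.0, 1.0, size=(5, 2)),           # nominal points
                  np.array([[6.0, 6.0], [-7.0, 5.0]])])        # obvious outliers

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_train)
print(model.predict(test))   # +1 = fits the normal model, -1 = flagged outlier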

Posted in: Briefs, Electronics & Computers, Software, Analysis methodologies, Safety critical systems, Vehicle health management, Data management

Self-Stabilizing Distributed Clock Synchronization Protocol for Arbitrary Digraphs

Langley Research Center, Hampton, Virginia

A report describes a self-stabilizing distributed clock synchronization protocol for fault-free systems. It focuses on distributed clock synchronization across an arbitrary, non-partitioned digraph, ranging from fully connected to 1-connected networks of nodes, while allowing for differences in the network elements. The protocol makes no assumptions about the initial state of the system other than the presence of at least one node, and no central clock or centrally generated signal, pulse, or message is used.

Posted in: Briefs, Electronics & Computers, Software, Simulation and modeling, Communication protocols

Precision Navigation Strategies for Primitive Solar-System-Body Sample Return Missions

Goddard Space Flight Center, Greenbelt, Maryland

This project investigated advanced navigation strategies required to approach, perform proximity operations at, and return a sample from an asteroid or comet. An optimized navigation strategy for a notional mission to a near-Earth asteroid was developed to serve as a baseline for future missions and mission proposals. Essential simulation and analysis software enhancements were developed and implemented in the Orbit Determination Toolbox (ODTBX), an open-source, early-mission navigation analysis tool suite built on a flexible architecture. The development efforts of this project resulted in the first fully open-source tool suite capable of performing primitive-body navigation simulation and analyses.

Posted in: Briefs, TSP, Electronics & Computers, Information Sciences, Software, Computer software and hardware, Spacecraft guidance
