Special Coverage

NASA Data Acquisition System (NDAS)

A software application is intended to be adaptable to any propulsion test stand or facility’s data acquisition system.

Stennis Space Center, Mississippi

The test complexes at John C. Stennis Space Center (SSC) require reliable and accurate data acquisition in order to analyze the results of rocket engine tests. Acquisition systems include high-speed data, low-speed data, event monitoring, and video feeds. To obtain accurate data, routine calibrations must be performed on each channel. A channel is defined as a single data stream to be collected, including the entire hardware chain from signal acquisition by a transducer through signal conditioning by an amplifier to digitization by an analog-to-digital converter.
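
As a rough illustration of the channel concept described above (not the NDAS design itself; all class names, fields, and values here are hypothetical), a channel record might tie one data stream to its hardware chain and calibration history:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Hypothetical sketch of a "channel": one data stream plus its hardware chain
# (transducer -> amplifier -> A/D converter) and calibration history.
# Names and fields are illustrative, not the NDAS schema.

@dataclass
class Calibration:
    performed_on: date
    slope: float          # engineering units per count (illustrative linear fit)
    offset: float

@dataclass
class Channel:
    name: str
    transducer: str       # e.g., pressure transducer part number
    amplifier: str        # signal-conditioning amplifier
    adc: str              # analog-to-digital converter
    sample_rate_hz: float
    calibrations: List[Calibration] = field(default_factory=list)

    def to_engineering_units(self, raw_counts: float) -> float:
        """Apply the most recent calibration to a raw A/D reading."""
        if not self.calibrations:
            raise ValueError(f"channel {self.name} has no calibration on record")
        cal = max(self.calibrations, key=lambda c: c.performed_on)
        return cal.slope * raw_counts + cal.offset

# Example: a low-speed pressure channel calibrated before a test.
ch = Channel("LP-101", "PT-500 transducer", "AMP-8", "16-bit ADC", 250.0)
ch.calibrations.append(Calibration(date(2015, 6, 1), slope=0.0153, offset=-1.2))
print(ch.to_engineering_units(4096))
```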

Posted in: Briefs, Electronics & Computers

Read More >>

Integrated Genomic and Proteomic Information Security Protocol

A security protocol requires a cryptanalysis infrastructure not available to most attackers.

Goddard Space Flight Center, Greenbelt, Maryland

The motivation for this research is that, for a variety of reasons, networks and their existing authentication and confidentiality infrastructure are becoming more vulnerable to attack. The protocols in this research are based upon a security architecture that relies upon codes derived from the processes that regulate gene expression. In vivo, these processes control and regulate transcription of DNA into various forms of RNA, translation of messenger RNA into proteins, and a variety of other pre- and post-transcriptional and translational regulatory processes. These processes utilize networks of protein and nucleic acid complexes. Through the use of information theory, the processes that regulate gene expression are being adapted to network and information security. The approach can be used in conjunction with legacy security architectures, algorithms, and processes, as well as with Mobile Ad hoc Networks (MANETs).
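
The brief does not spell out the actual encoding, so the toy below only illustrates the biological vocabulary it draws on: a DNA fragment is transcribed to mRNA, a few codons are translated with a deliberately partial genetic-code table, and the result is hashed into illustrative key material. It is a sketch of the metaphor, not the protocol.

```python
import hashlib

# Toy illustration only: transcribe DNA -> mRNA, translate a few codons into
# amino acids, then derive bytes from the result. This is NOT the protocol
# described above; it only illustrates the processes the brief references.

CODON_TABLE = {  # deliberately partial genetic-code table
    "AUG": "M",               # methionine (start)
    "UUU": "F", "UUC": "F",   # phenylalanine
    "GGU": "G", "GGC": "G",   # glycine
    "UAA": None, "UAG": None, "UGA": None,  # stop codons
}

def transcribe(dna: str) -> str:
    """DNA coding strand -> mRNA (T replaced by U)."""
    return dna.upper().replace("T", "U")

def translate(mrna: str) -> str:
    """Read codons until a stop codon or an unknown codon is reached."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODON_TABLE.get(mrna[i:i + 3])
        if aa is None:
            break
        protein.append(aa)
    return "".join(protein)

dna = "ATGTTTGGCTAA"                  # codes for Met-Phe-Gly, then stop
protein = translate(transcribe(dna))
key_material = hashlib.sha256(protein.encode()).digest()  # illustrative derivation
print(protein, key_material.hex()[:16])
```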

Posted in: Briefs, TSP, Electronics & Computers

Read More >>

Flight Processor Virtualization for Size, Weight, and Power Reduction

A flight software system that was originally deployed on six separate physical processors is modeled using a single processor.

Goddard Space Flight Center, Greenbelt, Maryland

This work demonstrated the cost-saving and fault-tolerance benefits of virtualization technology by consolidating the flight software from multiple flight processors into a single virtualized system. In this study, a flight software system that was originally deployed on six separate physical processors was modeled using a single processor and a real-time embedded hypervisor.
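
The brief does not describe the hypervisor configuration used in the study; purely as a sketch of the consolidation idea, the snippet below time-slices six hypothetical flight-software partitions over one processor in a fixed major frame, in the spirit of an ARINC 653-style partitioned scheduler that a real-time embedded hypervisor would enforce.

```python
# Toy illustration of consolidating six flight-software loads onto one
# processor with a fixed, time-sliced schedule. A real embedded hypervisor
# enforces the time and memory partitioning itself; this sketch only shows
# the scheduling idea, and the partition names and budgets are made up.

class Partition:
    def __init__(self, name, budget_ms):
        self.name = name
        self.budget_ms = budget_ms   # CPU time granted once per major frame

    def run(self):
        # Placeholder for this partition's flight software workload.
        return f"{self.name} ran for {self.budget_ms} ms"

# Six partitions standing in for the six original physical processors.
partitions = [Partition(f"FSW-{i}", budget_ms=20) for i in range(1, 7)]
MAJOR_FRAME_MS = sum(p.budget_ms for p in partitions)   # 120 ms cycle

def major_frame(frame_no):
    """Run every partition once per major frame, in a fixed order."""
    return [(frame_no, p.run()) for p in partitions]

for frame in range(2):               # simulate two major frames
    for entry in major_frame(frame):
        print(entry)
print(f"major frame length: {MAJOR_FRAME_MS} ms")
```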

Posted in: Briefs, TSP, Electronics & Computers

Read More >>

Massively Parallel Dantzig-Wolfe Decomposition Applied to Traffic Flow Scheduling

Future decision support tools may make use of the model with commercial-off-the-shelf software and hardware.

Ames Research Center, Moffett Field, California

Traffic flow management (TFM) of the National Airspace System (NAS) endeavors to deliver flights from their origins to their destinations while minimizing delays and respecting all capacities. There are several models for solving this problem: some aggregate flights into flows, while others consider controls for individual flights. Typically, the latter set of models is computationally difficult to solve for large-scale, high-fidelity scenarios. One of the more heavily studied aircraft-level models, presented by Bertsimas and Stock-Patterson (BSP), has runtime concerns that should not be overlooked, but it neatly describes the issues associated with TFM: respecting the capacities of airspace resources and the schedules of individual aircraft.
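
The brief does not include the formulation, so the sketch below only illustrates why Dantzig-Wolfe decomposition parallelizes well for this kind of problem: given dual prices on capacity constraints from a restricted master problem (omitted here), the per-flight pricing subproblems are independent and can be farmed out to worker processes. All flight, resource, and dual-price data are invented placeholders.

```python
from multiprocessing import Pool

# Sketch of the parallel pricing step in a Dantzig-Wolfe decomposition for
# flight-level traffic flow management. Each subproblem prices one flight's
# candidate schedules against dual prices on capacitated airspace resources.
# The restricted master LP that would produce the duals is not shown.

# Dual price per capacitated resource (e.g., a sector in a time window).
DUALS = {"SEC_A@10:00": 3.0, "SEC_B@10:15": 1.5, "DEST@10:30": 4.0}

# Candidate schedules per flight: (delay cost, resources used).
FLIGHTS = {
    "NASA101": [(0.0, ["SEC_A@10:00", "DEST@10:30"]),
                (2.0, ["SEC_B@10:15", "DEST@10:30"])],
    "NASA202": [(0.0, ["SEC_B@10:15"]),
                (1.0, ["SEC_A@10:00"])],
}

def price_flight(item):
    """Return the flight's lowest-reduced-cost candidate schedule."""
    flight, candidates = item
    def reduced_cost(c):
        return c[0] - sum(DUALS.get(r, 0.0) for r in c[1])
    best = min(candidates, key=reduced_cost)
    return flight, best, reduced_cost(best)

if __name__ == "__main__":
    with Pool() as pool:                       # subproblems solved in parallel
        for flight, schedule, rc in pool.map(price_flight, FLIGHTS.items()):
            # Columns with negative reduced cost would be added to the master.
            print(flight, schedule, rc)
```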

Posted in: Briefs, Electronics & Computers

Read More >>

Mission Control Technologies (MCT)

Ames Research Center, Moffett Field, California

MCT enables end users to compose software from objects that can be assembled to create integrated functionality. In contrast to the more traditional software development method of pre-determining functionality and building a monolithic application, applications are eliminated in favor of compositions of “live objects” that can be combined in different ways for different users and missions as required.
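
As a generic illustration of that composition model (not the MCT API; all class and method names here are hypothetical), the sketch below lets two users assemble the same “live objects” into different views:

```python
# Illustrative sketch only: "live objects" that end users nest into
# compositions instead of launching fixed, monolithic applications.

class LiveObject:
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, obj):                 # user composition: nest any object in any other
        self.children.append(obj)
        return self

    def render(self, indent=0):
        lines = [" " * indent + self.describe()]
        for child in self.children:
            lines.extend(child.render(indent + 2))
        return lines

    def describe(self):
        return self.name

class Telemetry(LiveObject):
    def __init__(self, name, value, unit):
        super().__init__(name)
        self.value, self.unit = value, unit

    def describe(self):
        return f"{self.name}: {self.value} {self.unit}"

# Two users compose the same objects differently for their own tasks.
battery = Telemetry("BatteryVoltage", 28.1, "V")
temp = Telemetry("BusTemp", 21.4, "C")
ops_view = LiveObject("OpsDashboard").add(battery).add(temp)
thermal_view = LiveObject("ThermalView").add(temp)

print("\n".join(ops_view.render()))
print("\n".join(thermal_view.render()))
```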

Posted in: Briefs, Electronics & Computers

Read More >>

Interface Validation for Distributed Software Systems

Goddard Space Flight Center, Greenbelt, Maryland

While performing IV&V (Independent Verification and Validation) on Space Station software, it was found that a number of interface faults were not discovered until integrated testing or actual software deployment. Faults found at this late phase of the software development lifecycle are very expensive to fix. Other research indicates that significant cost savings can be realized if these types of faults can be discovered at earlier lifecycle phases, such as specification or coding. This established a need for processes, procedures, and tools that will reliably identify interface faults during these earlier phases. The ability to perform interface validation earlier will reduce costly fixes due to interface faults discovered during later development phases.
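
As a generic illustration of catching an interface fault at the specification phase (not the specific process or tool this brief proposes; the schemas below are invented), one can compare the message fields a producing component declares against the fields a consuming component expects:

```python
# Generic illustration: compare the message fields one component declares it
# sends against the fields another component expects to receive, so interface
# mismatches surface at specification time rather than in integrated testing.

PRODUCER_SPEC = {   # fields and types the sending component declares
    "packet_id": "uint16",
    "timestamp": "uint32",
    "temperature": "float32",
}

CONSUMER_SPEC = {   # fields and types the receiving component expects
    "packet_id": "uint16",
    "timestamp": "uint64",     # type mismatch
    "pressure": "float32",     # field the producer never sends
}

def check_interface(producer, consumer):
    """Report missing fields and type mismatches between two interface specs."""
    faults = []
    for name, ctype in consumer.items():
        if name not in producer:
            faults.append(f"consumer expects '{name}', producer does not send it")
        elif producer[name] != ctype:
            faults.append(f"'{name}': producer sends {producer[name]}, consumer expects {ctype}")
    return faults

for fault in check_interface(PRODUCER_SPEC, CONSUMER_SPEC):
    print("interface fault:", fault)
```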

Posted in: Briefs, TSP, Electronics & Computers

Read More >>

Simple, Scalable, Script-Based Science Processing Archive

The system provides data access control, data subscription, metadata publication, and data recovery.

Goddard Space Flight Center, Greenbelt, Maryland

Simple, Scalable, Script-Based Science Processing (S4P) Archive (S4PA) is a disk-based data-archiving system for remote sensing data. It is based on the data-driven framework of S4P. The system is used for new data transfer, data preprocessing, metadata generation, and data archival, and it provides services such as data access control, data subscription, metadata publication, and data recovery. The data are archived on readily available disk drives, with FTP (File Transfer Protocol) and HTTP (Hypertext Transfer Protocol) as the primary modes of data access. S4PA includes a graphical user interface for monitoring and reconfiguring system operation, a tool for deploying the system, and various other tools that help manage the data ingest and archiving process, including data replication, auxiliary file backup, database merging, storage of dataset README documents in CVS (Concurrent Versions System), an interface for machine search, and deployment of S4PA instances from configurations stored in CVS.
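
Purely as a sketch of the data-driven pattern described above (not S4PA's actual implementation; the directory layout and metadata fields here are illustrative), each file that appears in an incoming area is ingested, a small metadata record is generated, and the granule is moved onto archive disk:

```python
import hashlib
import json
import shutil
from pathlib import Path

# Minimal sketch of a data-driven ingest-and-archive step (not S4PA itself):
# every file dropped into an incoming area is checksummed, a small metadata
# record is written, and the file is moved onto archive disk.

INCOMING = Path("incoming")   # illustrative directory names
ARCHIVE = Path("archive")

def ingest(path: Path) -> Path:
    """Archive one data file and write a metadata record beside it."""
    data = path.read_bytes()
    meta = {
        "granule": path.name,
        "size_bytes": len(data),
        "checksum_md5": hashlib.md5(data).hexdigest(),
    }
    ARCHIVE.mkdir(exist_ok=True)
    dest = ARCHIVE / path.name
    shutil.move(str(path), dest)                      # move onto archive disk
    (ARCHIVE / (path.name + ".meta.json")).write_text(json.dumps(meta, indent=2))
    return dest

if __name__ == "__main__":
    INCOMING.mkdir(exist_ok=True)
    for f in sorted(INCOMING.iterdir()):              # data-driven: act on whatever arrived
        if f.is_file():
            print("archived:", ingest(f))
```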

Posted in: Briefs, TSP, Electronics & Computers

Read More >>