Architecture for an Intermediate-Frequency Digital Downconversion and Data Distribution Network

Developed originally for Deep Space Network downlink receivers, this architecture also has applications in high-speed digital receivers for cellular networks.

NASA’s Jet Propulsion Laboratory, Pasadena, California

NASA’s Deep Space Network (DSN) is modernizing its aging downlink receivers for telemetry, tracking, and radio science. The goal is to replace multiple types of custom-built, special-purpose receivers with a unified receiver architecture that supports the various downlink data types. As part of this modernization, the data are to be digitized only once and then distributed over commercial switching network technology to multiple back-end receiver processing hardware and software elements. The main problem to be solved is how to distribute high-bandwidth, digitized intermediate-frequency (100 to 600 MHz) signals efficiently and flexibly across a signal processing center for use in the DSN.
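
The brief describes the concept at a high level; the sketch below illustrates one generic way a "digitize once, distribute to many" scheme could look over commodity network hardware, using UDP multicast so several back-end processors can subscribe to the same sample stream. The multicast group, port, packet layout, and sample format are illustrative assumptions, not the DSN design.

```python
# Minimal sketch (not the DSN design): distribute digitized IF samples once
# over a commercial switched network using UDP multicast, so any number of
# back-end processors can subscribe to the same stream. The group address,
# port, and packet layout are illustrative assumptions.

import socket
import struct

import numpy as np

MCAST_GROUP = "239.1.2.3"   # assumed multicast group
MCAST_PORT = 50000          # assumed port
SAMPLES_PER_PACKET = 1024   # assumed payload size in 16-bit samples

def send_if_stream(num_packets=10):
    """Packetize synthetic 16-bit IF samples and multicast them."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    for seq in range(num_packets):
        # Synthetic samples stand in for the output of the IF digitizer.
        samples = (np.random.randn(SAMPLES_PER_PACKET) * 1000).astype(np.int16)
        # Simple header: a sequence number so receivers can detect loss.
        packet = struct.pack("!I", seq) + samples.tobytes()
        sock.sendto(packet, (MCAST_GROUP, MCAST_PORT))
    sock.close()

if __name__ == "__main__":
    send_if_stream()
```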

Posted in: Briefs, Electronics

SEA5: Space Environment Automated Alerts & Anomaly Analysis Assistant

Goddard Space Flight Center, Greenbelt, Maryland

The Community Coordinated Modeling Center (CCMC) provides a wide range of space weather tools and services for the general scientific community. One such set of products, which facilitates space weather situational awareness, is collectively known as the Integrated Space Weather Analysis (ISWA) System. Using the ISWA system and other tools, space weather forecasters can assess the space environment both in real time and for historical cases; both capabilities help mitigate potential space weather impacts on missions and assist in spacecraft anomaly resolution. The Space Environment Automated Alerts & Anomaly Analysis Assistant (SEA5) will provide past, present, and predicted space environment information for specific missions, orbits, and user-specified locations throughout the heliosphere, in geospace, and on the ground.
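
As a rough illustration of the kind of mission-specific, time-tagged information SEA5 is described as providing, the sketch below defines a hypothetical alert record covering past, present, or predicted conditions at a user-specified location. The field names and example values are assumptions, not the SEA5 data model.

```python
# Illustrative sketch only: a hypothetical record structure for the kind of
# mission-specific space environment information SEA5 is described as providing
# (past, present, or predicted conditions at a user-specified location).
# The field names and example values are assumptions, not the SEA5 data model.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SpaceEnvironmentAlert:
    mission: str          # mission or spacecraft the alert applies to
    location: str         # orbit, heliospheric, geospace, or ground location
    issued: datetime      # when the alert was generated
    valid_from: datetime  # start of the interval the alert covers
    valid_to: datetime    # end of the interval (in the future for a forecast)
    quantity: str         # monitored quantity, e.g. ">2 MeV electron flux"
    severity: str         # e.g. "watch", "warning", "alert"

# Example: a forecast-style alert for a notional geostationary mission.
alert = SpaceEnvironmentAlert(
    mission="ExampleSat",
    location="GEO, 75 W",
    issued=datetime.now(timezone.utc),
    valid_from=datetime(2015, 3, 17, 0, 0, tzinfo=timezone.utc),
    valid_to=datetime(2015, 3, 18, 0, 0, tzinfo=timezone.utc),
    quantity=">2 MeV electron flux",
    severity="warning",
)
print(alert)
```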

Posted in: Briefs, Data Acquisition

Extensible Data Gateway Environment (EDGE)

NASA’s Jet Propulsion Laboratory, Pasadena, California

The NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) is NASA’s designated data center for information relevant to the physical state of the ocean. Its core data management and workflow system, the Data Management and Archive System (DMAS), processes hundreds of thousands of data products each day, around the clock. Its inventory captures over 800 datasets, several million granules, and millions of files. PO.DAAC needs a solution that helps users quickly identify the relevant oceanographic data artifact, and it also needs to export metadata according to the ISO-19115, FGDC, and GCMD specifications. Developing such a solution on top of its Oracle database has several issues. First, it is difficult to maintain, since SQL must be updated whenever a schema changes or new search criteria are needed. Second, multi-table joins yield poor performance. Third, query performance can be improved with additional indexes, but update performance then suffers. Fourth, exposing the operational database as the direct backend of a publicly accessible service layer would subject Oracle to Denial of Service (DoS) attacks, which could halt the already very busy DMAS operational environment.
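
The sketch below illustrates the general pattern the brief motivates, not the EDGE implementation itself: denormalize dataset metadata out of the operational database into a separate, read-only search index so that public search queries never touch the operational system. The record fields and the simple in-memory inverted index are illustrative assumptions.

```python
# Minimal sketch of the pattern the brief motivates (not the EDGE implementation):
# export denormalized dataset metadata from the operational database into a
# separate, read-only search index so public queries never hit the operational
# system. The record fields and in-memory index are illustrative assumptions.

from collections import defaultdict

# Denormalized metadata records, as they might be exported from the archive.
DATASETS = [
    {"id": "A", "title": "Sea Surface Temperature L4", "keywords": ["sst", "temperature"]},
    {"id": "B", "title": "Sea Surface Height Anomaly", "keywords": ["ssh", "altimetry"]},
    {"id": "C", "title": "Ocean Surface Wind Vectors", "keywords": ["wind", "scatterometer"]},
]

def build_index(records):
    """Build a simple inverted index: term -> set of dataset ids."""
    index = defaultdict(set)
    for rec in records:
        terms = rec["title"].lower().split() + rec["keywords"]
        for term in terms:
            index[term].add(rec["id"])
    return index

def search(index, query):
    """Return dataset ids matching every query term (AND semantics)."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

index = build_index(DATASETS)
print(search(index, "surface wind"))   # -> {'C'}
```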

Posted in: Briefs, Data Acquisition

Flight Test Maneuvers for Efficient Aerodynamic Modeling

Langley Research Center, Hampton, Virginia

Flight testing is expensive, so it is important that the necessary flight data be collected as efficiently as possible. Inputs traditionally used in flight test maneuvers to collect aircraft stability and control data include doublets, impulses (stick raps), multisteps, and frequency sweeps. All of these input types are designed for single-axis response, although the inputs are often applied sequentially to different controls to collect multi-axis data.
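
For illustration, the sketch below generates two of the traditional single-axis inputs named above, a doublet and a frequency sweep, as control command time histories. The amplitudes, timing, and sweep band are arbitrary assumptions chosen only to show the input shapes.

```python
# Illustrative sketch: generate two traditional single-axis flight test inputs,
# a doublet and a linear frequency sweep, as control command time histories.
# Amplitudes, durations, and the sweep band are arbitrary assumptions.

import numpy as np

def doublet(t, start=1.0, pulse_width=1.0, amplitude=1.0):
    """Doublet: positive pulse immediately followed by an equal negative pulse."""
    u = np.zeros_like(t)
    u[(t >= start) & (t < start + pulse_width)] = amplitude
    u[(t >= start + pulse_width) & (t < start + 2 * pulse_width)] = -amplitude
    return u

def frequency_sweep(t, f0=0.1, f1=2.0, amplitude=1.0):
    """Linear chirp from f0 to f1 Hz over the full record length."""
    duration = t[-1] - t[0]
    rate = (f1 - f0) / duration
    phase = 2.0 * np.pi * (f0 * t + 0.5 * rate * t**2)
    return amplitude * np.sin(phase)

t = np.arange(0.0, 20.0, 0.01)       # 20 s record at 100 Hz
elevator_doublet = doublet(t)         # e.g., applied to the elevator
elevator_sweep = frequency_sweep(t)   # inputs applied one axis at a time
```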

Posted in: Briefs, Data Acquisition

JPL CO2 Virtual Science Data Environment (VSDE)

NASA’s Jet Propulsion Laboratory, Pasadena, California

The JPL CO2 Virtual Science Data Environment (VSDE) (http://co2.jpl.nasa.gov) is a comprehensive effort to bring together the models, data, and tools necessary for atmospheric CO2 research. The VSDE site is designed to provide streamlined Web-based discovery and access to multiple global and regional carbon dioxide data sets. Furthermore, this site provides tools for conversion, manipulation, and transformation of the data to facilitate research.

Posted in: Briefs, Data Acquisition

Python Advanced Microwave Precipitation Radiometer Data Toolkit (PyAMPR)

Marshall Space Flight Center, Alabama

Advanced Microwave Precipitation Radiometer (AMPR) brightness temperature data from NASA field projects are in ASCII format. This Python script defines a class that reads a single file from an individual aircraft flight; pulls out timing, per-channel brightness temperatures, geolocation, and other information; and stores them as attributes using numpy arrays of the appropriate type.
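
The sketch below shows what a minimal reader of this kind might look like; it is not the actual PyAMPR class, and the assumed ASCII column layout and attribute names are illustrative only.

```python
# Minimal sketch of the kind of reader described above, not the actual PyAMPR
# class or its attribute names. The assumed ASCII column layout
# (time, latitude, longitude, then one brightness temperature per channel)
# is an illustrative assumption.

import numpy as np

class AmprFlightFile:
    """Read one AMPR-style ASCII flight file into numpy array attributes."""

    def __init__(self, filename, n_channels=4):
        data = np.loadtxt(filename)             # one row per scan sample
        self.time = data[:, 0]                   # seconds of day (assumed)
        self.latitude = data[:, 1]
        self.longitude = data[:, 2]
        # One column of brightness temperatures (K) per channel.
        self.tb = {ch: data[:, 3 + ch] for ch in range(n_channels)}

# Usage (file name is hypothetical):
# flight = AmprFlightFile("ampr_20130612_flight.txt")
# print(flight.tb[0].mean())
```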

Posted in: Briefs, Data Acquisition

Data Ordering Genetic Optimization (DOGO) System

Ordering data from most to least useful replaces quality flags, improves climate science results, prioritizes images for analysis, and guides analysts toward optimal data filtration.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Observations in modern datasets have a continuum of quality that can be hard to quantify. Satellite observations, for example, are subject to often-subtle mixtures of confounding forces that distort an observation’s utility to varying extents. For the Orbiting Carbon Observatory-2 (OCO-2), effects such as cloud cover, atmospheric aerosols, and surface roughness are three major confounding forces that can mildly, heavily, or totally compromise an observation’s utility. These complicating factors are not present in a binary fashion: clouds can cover a varying percentage of the scene, have variable opacity, and take on differing topologies. Traditionally, arbitrary thresholds are placed on the presence of such forces to yield a binary good/bad data flag for each observation. By instead generating a data ordering, users are guided toward the most reliable data first, followed by increasingly challenging observations. No harsh on/off threshold is applied, so useful data are not hidden from one user while confounded observations are left in for another. Allowing users to create custom filters based on DOGO’s data ordering leaves hard cutoff decisions in the hands of users, guided but not restricted by the project’s expert knowledge.
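
The sketch below illustrates only the core ordering idea: combine the confounding factors into a single score, rank observations from most to least reliable, and let each user choose a cutoff along that ordering. The weights are arbitrary placeholders; the genetic optimization DOGO uses to derive the ordering is not reproduced here.

```python
# Illustrative sketch of the ordering idea only: rank observations by a combined
# confounding score instead of collapsing each factor into a binary good/bad flag.
# The weights are arbitrary placeholders; the genetic optimization DOGO uses to
# derive the ordering is not reproduced in this sketch.

import numpy as np

rng = np.random.default_rng(0)
n_obs = 1000

# Per-observation confounding factors on a 0 (clean) to 1 (fully confounded) scale.
cloud_fraction = rng.random(n_obs)
aerosol_load = rng.random(n_obs)
surface_roughness = rng.random(n_obs)

# Combined score with placeholder weights (higher = more confounded).
weights = np.array([0.5, 0.3, 0.2])
score = weights @ np.vstack([cloud_fraction, aerosol_load, surface_roughness])

# The ordering: observation indices from most to least reliable.
ordering = np.argsort(score)

# A user-chosen cutoff replaces a project-imposed binary flag: this user keeps
# the cleanest 30% of observations; another user might keep 60%.
keep = ordering[: int(0.3 * n_obs)]
print(f"Kept {keep.size} observations; worst kept score = {score[keep].max():.3f}")
```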

Posted in: Briefs, Data Acquisition
