Information Science

Evaluating Performance of Components

Parallel Component Performance Benchmarks is a computer program developed to aid the evaluation of the Common Component Architecture (CCA) — a software architecture, based on a component model, that was conceived to foster high-performance computing, including parallel computing. More specifically, this program compares the performances (principally by measuring computing times) of componentized versus conventional versions of the Parallel Pyramid 2D Adaptive Mesh Refinement library — a software library that is used to generate computational meshes for solving physical problems and that is typical of software libraries in use at NASA's Jet Propulsion Laboratory.
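As a hedged illustration of the kind of comparison such a benchmark performs (the function and class names below are invented for this sketch and are not part of the actual library), the following Python fragment times the same computation invoked directly and through a component-style indirection layer:

```python
import time

def refine_mesh(levels):
    # Stand-in for a mesh-refinement kernel: each level quadruples the cell count.
    cells = 1
    for _ in range(levels):
        cells *= 4
    return cells

class MeshComponent:
    """Stand-in for a componentized wrapper that adds an indirection layer."""
    def invoke(self, method, *args):
        return getattr(self, method)(*args)

    def refine(self, levels):
        return refine_mesh(levels)

def benchmark(fn, *args, repeats=1000):
    start = time.perf_counter()
    for _ in range(repeats):
        result = fn(*args)
    return result, time.perf_counter() - start

direct_result, direct_time = benchmark(refine_mesh, 10)
comp = MeshComponent()
comp_result, comp_time = benchmark(comp.invoke, "refine", 10)

# Both paths must compute the same answer; only the timings differ.
assert direct_result == comp_result
print(f"direct: {direct_time:.6f}s  componentized: {comp_time:.6f}s")
```

The point of such a benchmark is precisely this kind of paired measurement: identical results from both code paths, with any difference in elapsed time attributable to the component overhead.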

Posted in: Information Sciences, Briefs, TSP

Tools for Administration of a UNIX-Based Network

Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of sub-networks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells; these tools check the requested changes for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance of, and lookup of, information about IP addresses for a network of computers.
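To give a concrete, hypothetical flavor of the error checking such account tools might perform (the shell list and function below are illustrative assumptions, not the actual tools' logic), a shell-change request could be validated along these lines:

```python
# Hypothetical approved-shell list, analogous to entries in /etc/shells.
VALID_SHELLS = {"/bin/sh", "/bin/bash", "/bin/csh", "/bin/tcsh", "/bin/ksh"}

def validate_shell_change(username, new_shell, valid_shells=VALID_SHELLS):
    """Return a list of error messages; an empty list means the request is valid."""
    errors = []
    if not username.isalnum():
        errors.append(f"invalid username: {username!r}")
    if new_shell not in valid_shells:
        errors.append(f"{new_shell!r} is not an approved login shell")
    return errors

# A valid request produces no errors; an unapproved shell is rejected.
assert validate_shell_change("jdoe", "/bin/bash") == []
assert validate_shell_change("jdoe", "/bin/evil") != []
```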

Posted in: Information Sciences, Briefs

Collaborative Planning of Robotic Exploration

The Science Activity Planner (SAP) software system includes an uplink-planning component, which enables collaborative planning of activities to be undertaken by an exploratory robot on a remote planet or on Earth. Included in the uplink-planning component is the SAP-Uplink Browser, which enables users to load multiple spacecraft activity plans into a single window, compare them, and merge them. The uplink-planning component includes a subcomponent that implements the Rover Markup Language Activity Planning (RML-AP) format, which is based on the Extensible Markup Language (XML) and enables the representation, within a single document, of planned spacecraft and robotic activities together with the scientific reasons for the activities. Each such document is highly parseable and can be validated easily. Another subcomponent of the uplink-planning component is the Activity Dictionary Markup Language (ADML), which eliminates the need for two mission activity dictionaries (one in a human-readable format and one in a machine-readable format). Style sheets developed along with the ADML format enable users to edit a single dictionary in a user-friendly environment without compromising the machine-readability of the format.
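Because RML-AP is XML-based, a plan document can be parsed with any standard XML library. The sketch below uses invented element and attribute names, not the actual RML-AP schema, to illustrate how activities and their scientific rationale can live in one parseable document:

```python
import xml.etree.ElementTree as ET

# Hypothetical RML-AP-style document; the tag names are illustrative assumptions.
plan_xml = """
<activity-plan>
  <activity id="img-001" instrument="pancam">
    <command>ACQUIRE_IMAGE</command>
    <rationale>Survey layering in outcrop to test sedimentary hypothesis</rationale>
  </activity>
  <activity id="drv-002" instrument="mobility">
    <command>DRIVE</command>
    <rationale>Approach outcrop for follow-up spectrometry</rationale>
  </activity>
</activity-plan>
"""

root = ET.fromstring(plan_xml)
# Each activity carries its scientific rationale alongside the command,
# so planning tools and reviewers read the same single document.
for activity in root.findall("activity"):
    print(activity.get("id"), activity.findtext("command"),
          "--", activity.findtext("rationale"))
```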

Posted in: Information Sciences, Briefs, TSP

Framework for Development of Object-Oriented Software

The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
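The division of labor described above can be sketched in miniature. The Python fragment below is an illustrative pattern, not the RTC Framework's actual C++ API: a framework-style base class owns the threading and message-dispatch plumbing, so an application subclass supplies only the domain-specific handler:

```python
import queue
import threading

class Application:
    """Framework-style base class (an illustrative sketch, not the RTC API):
    owns a worker thread and message queue so subclasses write only domain code."""
    def __init__(self):
        self._inbox = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def _run(self):
        while True:
            message = self._inbox.get()
            if message is None:        # shutdown sentinel
                break
            self.handle(message)

    def send(self, message):
        self._inbox.put(message)

    def stop(self):
        self._inbox.put(None)
        self._worker.join()

    def handle(self, message):
        raise NotImplementedError      # domain-specific code goes here

class TelemetryLogger(Application):
    """Domain-specific part: only the handler is written by the application team."""
    def __init__(self):
        self.seen = []
        super().__init__()

    def handle(self, message):
        self.seen.append(message)

app = TelemetryLogger()
app.send("GOX vent temp nominal")
app.send("LH2 load at 98%")
app.stop()
print(app.seen)
```

The reported 66-percent reduction in development time comes from exactly this separation: the queueing, threading, and shutdown logic is written once in the framework rather than re-implemented in every application.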

Posted in: Information Sciences, Briefs

Faster Processing for Inverting GPS Occultation Data

A document outlines a computational method that can be incorporated into two prior methods used to invert Global Positioning System (GPS) occultation data [signal data acquired by a low-Earth-orbiting satellite as either this satellite or the GPS satellite rises above or falls below the horizon] to obtain information on altitude-dependent properties of the atmosphere. The two prior inversion methods, known as back propagation and canonical transform, are computationally expensive because for each occultation, they involve numerical evaluation of a large number of diffraction-like spatial integrals. The present method involves an angular-spectrum-based phase-extrapolation approximation in which each data point is associated with a plane-wave component that propagates in a unique direction from the orbit of the receiving satellite to intersect a straight line tangent to the orbit at a nearby point. This approximation enables the use of fast Fourier transforms (FFTs), which apply only to data collected along a straight-line trajectory. The computation of the diffraction-like integrals in the angular-spectrum domain by use of FFTs takes only seconds, whereas previously it took minutes.
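The source of the speedup is the replacement of direct, point-by-point evaluation of the integrals with an FFT. A toy self-contained illustration of that trade (not the actual inversion code, and using synthetic data) compares a direct discrete Fourier transform, which costs O(n²) operations, with a radix-2 FFT, which costs O(n log n) and produces the same values:

```python
import cmath

def dft(x):
    """Direct evaluation: O(n^2) sums, analogous to evaluating each
    diffraction-like integral point by point."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: O(n log n), usable once the data are
    associated with a straight-line trajectory as in the approximation above."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddled[k] for k in range(n // 2)] +
            [even[k] - twiddled[k] for k in range(n // 2)])

samples = [complex(k % 5, 0) for k in range(64)]   # toy stand-in for occultation data
slow, fast = dft(samples), fft(samples)
# Same spectrum from both algorithms, to within floating-point round-off.
assert all(abs(a - b) < 1e-9 for a, b in zip(slow, fast))
```

For realistic occultation record lengths, this asymptotic difference is what turns minutes of integral evaluation into seconds.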

Posted in: Information Sciences, Briefs, TSP

Determining Sizes of Particles in a Flow From DPIV Data

A proposed method of measuring the sizes of particles entrained in a flow of a liquid or gas would utilize data from digital particle-image velocimetry (DPIV) of the flow, so that the same equipment would measure particle sizes as well as velocities. That is to say, with proper design and operation of a DPIV system, the DPIV data could be processed according to the proposed method to obtain particle sizes in addition to particle velocities.

Posted in: Information Sciences, Briefs

Simultaneous Product Development Provides a New Approach to Design Collaboration

This approach enables sharing and merging of any element of a digital model, and allows collaborative data to flow in any direction. Product development is becoming increasingly global, and as a result new challenges have emerged, such as coordinating geographically dispersed teams of suppliers and partners. Consequently, the modern enterprise resembles a network of interconnected nodes that work in parallel and require constant, dynamic synchronization. The widespread adoption of digital design documents, the introduction of collaboration tools such as Web-based review and markup, and Internet-accessible databases have made design data more readily available.

Posted in: Information Sciences, Briefs
