Tech Briefs

Reducing Centroid Error Through Model-Based Noise Reduction

Corrections are made for bias and noise. A method of processing the digitized output of a charge-coupled device (CCD) image detector has been devised to reduce the error in the computed centroid of the image of a point source of light. The method involves model-based estimation of, and correction for, the contributions of bias and noise to the image data. The method could be used to advantage in any of a variety of applications that require measuring the precise locations of, and/or precisely aiming optical instruments toward, point light sources.
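The brief's actual bias-and-noise model is not given here, but the general idea of correcting for a bias level before computing an intensity-weighted centroid can be sketched as follows. This is a minimal illustration, assuming the bias can be estimated from signal-free border pixels; the function name and the border heuristic are hypothetical, not the method described in the brief.

```python
import numpy as np

def corrected_centroid(image, border=2):
    """Estimate the centroid of a point-source image after a simple
    bias correction: the bias is taken as the median of the border
    pixels (assumed free of signal) and subtracted, and negative
    residuals are clipped so that noise does not pull the centroid."""
    img = np.asarray(image, dtype=float)
    # Estimate the bias level from the frame's border pixels.
    border_pixels = np.concatenate([
        img[:border, :].ravel(), img[-border:, :].ravel(),
        img[:, :border].ravel(), img[:, -border:].ravel(),
    ])
    bias = np.median(border_pixels)
    corrected = np.clip(img - bias, 0.0, None)
    total = corrected.sum()
    if total == 0.0:
        raise ValueError("no signal above the estimated bias")
    rows, cols = np.indices(img.shape)
    # Intensity-weighted mean position of the bias-corrected image.
    return (rows * corrected).sum() / total, (cols * corrected).sum() / total
```

Without the bias subtraction, the uniform background would drag the computed centroid toward the center of the frame; the correction is what lets the centroid track the point source itself.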

Posted in: Information Sciences, Briefs

Templates for Fabricating Nanowire/Nanoconduit-Based Devices

Prior templating processes are being extended to finer spatial resolutions. An effort is underway to develop processes for making templates that could be used as deposition molds and etching masks in the fabrication of devices containing arrays of nanowires and/or nanoconduits. Examples of such devices include thermoelectric devices, nerve guidance scaffolds for nerve repair, photonic-band-gap devices, filters for trapping microscopic particles suspended in liquids, microfluidic devices, and size-selective chemical sensors. The technology is an extension of previous work conducted by JPL, UCSD (University of California, San Diego), and Paradigm Optics Inc., which developed a process to fabricate macroporous scaffolds for spinal-cord repair.

Posted in: Manufacturing & Prototyping, Briefs

Measuring Vapors To Monitor the State of Cure of a Resin

Excess curing time would no longer be needed as margin against uncertainty. A proposed noninvasive method of monitoring the cure path and the state of cure of an epoxy or other resin involves measurement of the concentration(s) of one or more compound(s) in the vaporous effluent emitted during the curing process. The method is based on the following general ideas:

Posted in: Manufacturing & Prototyping, Briefs

Statistical Evaluation of Utilization of the ISS

PayLoad Utilization Modeler (PLUM) is a statistical-modeling computer program used to evaluate the effectiveness of utilization of the International Space Station (ISS) in terms of the number of research facilities that can be operated within a specified interval of time. PLUM is designed to balance the requirements of research facilities aboard the ISS against the resources available on the ISS. PLUM comprises three parts: an interface for the entry of data on constraints and on required and available resources, a database that stores these data as well as the program output, and a modeler. The modeler comprises two subparts: one that generates tens of thousands of random combinations of research facilities and another that calculates the usage of resources for each of those combinations. The results of these calculations are used to generate graphical and tabular reports to determine which facilities are most likely to be operable on the ISS, to identify which ISS resources are inadequate to satisfy the demands upon them, and to generate other data useful in allocation and planning of resources.
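The modeler's core loop — drawing many random facility combinations, checking each against resource capacities, and tallying which facilities survive — can be illustrated with a small Monte Carlo sketch. All facility names, demand figures, and capacities below are hypothetical placeholders; PLUM's real inputs come from its database.

```python
import random

# Hypothetical facility resource demands (power in kW, crew-hours/week)
# and station capacities -- stand-ins for PLUM's database inputs.
FACILITIES = {
    "fluids_rack":   {"power": 3.0, "crew": 10},
    "furnace":       {"power": 6.0, "crew": 4},
    "bio_incubator": {"power": 1.5, "crew": 12},
    "glovebox":      {"power": 2.0, "crew": 8},
}
CAPACITY = {"power": 9.0, "crew": 25}

def feasible(combo):
    """A combination is usable only if its total demand fits every resource."""
    for res, cap in CAPACITY.items():
        if sum(FACILITIES[f][res] for f in combo) > cap:
            return False
    return True

def operability(trials=10000, seed=42):
    """Estimate, over many random combinations, how often each facility
    appears in a combination that fits within the resource capacities."""
    rng = random.Random(seed)
    counts = dict.fromkeys(FACILITIES, 0)
    hits = 0
    for _ in range(trials):
        k = rng.randint(1, len(FACILITIES))
        combo = rng.sample(sorted(FACILITIES), k)
        if feasible(combo):
            hits += 1
            for f in combo:
                counts[f] += 1
    return {f: counts[f] / hits for f in FACILITIES}
```

Facilities with high scores are the ones "most likely to be operable"; a resource that causes most combinations to fail would show up as the binding constraint in `feasible`.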

Posted in: Software, Briefs

Shuttle Data Center File-Processing Tool in Java

A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular-expression queries of SDC archive files, reads the matching files, interleaves their time-stamped samples, and transforms the results into a chosen output format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
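The interleaving step — merging time-stamped samples from several files into one time-ordered stream, then emitting them in a chosen format — can be sketched briefly. The tuple layout and the CSV output are illustrative assumptions, not the SDC tool's actual record or file formats.

```python
import heapq

def interleave(*streams):
    """Merge several sample streams, each already sorted by time stamp,
    into one time-ordered stream.  Samples are (timestamp, channel,
    value) tuples; heapq.merge keeps the merge lazy, which matters
    when the archive files are large."""
    return list(heapq.merge(*streams, key=lambda s: s[0]))

def to_csv_rows(samples):
    """One example output format: flat CSV rows suitable for plotting."""
    return ["%s,%s,%s" % s for s in samples]
```

Other output formats (e.g., one column per channel for multivariate analysis) would be alternative back-ends fed by the same interleaved stream.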

Posted in: Software, Briefs

X-Windows PVT Widget Class

The X-Windows Process Validation Table (PVT) Widget Class ("Class" is used here in the object-oriented programming sense of the word) was devised to simplify the task of implementing network registration services for Information Sharing Protocol (ISP) graphical user interface (GUI) computer programs. Heretofore, ISP PVT programming tasks have required many method calls to identify, query, and interpret the connections and messages exchanged between a client and a PVT server. Normally, programmers have utilized direct access to UNIX socket libraries to implement the PVT protocol queries, necessitating many lines of source code to perform frequent tasks. Now, the X-Windows PVT Widget Class encapsulates ISP client-server network registration management tasks within the framework of an X-Windows widget. Use of the widget framework enables an X-Windows GUI program to interact with PVT services in an abstract way and in the same manner as that of other graphical widgets, making it easier to program PVT clients. Wrapping the PVT services inside the widget framework enables a programmer to treat a PVT server interface as though it were a GUI. Moreover, an alternate subclass could implement another service in a widget of the same type.
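The design idea — hiding a socket-level protocol behind a widget-style object with registered callbacks — can be illustrated schematically. This sketch is not the actual ISP/X-Windows API: the class names, the `REGISTER` message, and the fake transport are all invented for illustration.

```python
class PvtWidget:
    """Illustrative only (not the real ISP/X-Windows widget API): wrap a
    raw request/response transport behind a widget-like interface so
    client code registers callbacks instead of speaking the socket
    protocol directly."""
    def __init__(self, transport):
        self._transport = transport   # anything with send(str) -> str
        self._callbacks = {}

    def on(self, event, callback):
        """Register a handler, in the same spirit as widget event callbacks."""
        self._callbacks[event] = callback

    def register(self, client_name):
        """One method call replaces the many socket-level steps the brief
        describes; the server's reply is dispatched to a callback."""
        reply = self._transport.send("REGISTER " + client_name)
        handler = self._callbacks.get("registered")
        if handler:
            handler(reply)
        return reply

class FakeTransport:
    """Stand-in for a PVT server connection, for demonstration only."""
    def send(self, msg):
        return "ACK " + msg

seen = []
widget = PvtWidget(FakeTransport())
widget.on("registered", seen.append)
widget.register("gui_client")
```

A subclass overriding only the message-building and dispatch logic could expose a different network service through the same widget-shaped interface, which is the reuse the brief's last sentence points to.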

Posted in: Software, Briefs

Using Dissimilarity Metrics to Identify Interesting Designs

A computer program helps to blend the power of automated-search software, which is able to generate large numbers of design solutions, with the insight of expert designers, who are able to identify preferred designs but do not have time to examine all the solutions. From among the many automated solutions to a given design problem, the program selects a smaller number of solutions that are worthy of scrutiny by the experts in the sense that they are sufficiently dissimilar from each other. The program makes the selection in an interactive process that involves a sequence of data-mining steps interspersed with visual displays of the results of these steps to the experts. At crucial points between steps, the experts provide directives to guide the process. The program uses heuristic search techniques to identify nearly optimal design solutions and uses dissimilarity metrics defined by the experts to characterize the degree to which solutions are interestingly different. The search, data-mining, and visualization features of the program were derived from previously developed risk-management software used to support a risk-centric design methodology.
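One standard way to pick a small, mutually dissimilar subset under an expert-supplied metric is greedy farthest-point selection, sketched below. The brief does not state which selection algorithm its program uses, so this is an assumed illustration of the general technique; the metric here is a simple numeric difference standing in for an expert-defined one.

```python
def select_dissimilar(solutions, k, dissimilarity):
    """Greedy farthest-point selection: repeatedly add the solution whose
    minimum dissimilarity to the already-chosen set is largest, so the
    k survivors are mutually 'interestingly different'."""
    chosen = [solutions[0]]          # seed with an arbitrary solution
    while len(chosen) < k:
        best = max(
            (s for s in solutions if s not in chosen),
            key=lambda s: min(dissimilarity(s, c) for c in chosen),
        )
        chosen.append(best)
    return chosen
```

In the interactive process the brief describes, the experts could redefine `dissimilarity` between rounds, and the selection would be rerun on the surviving pool.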

Posted in: Software, Briefs
