Special Coverage

Supercomputer Cooling System Uses Refrigerant to Replace Water
Computer Chips Calculate and Store in an Integrated Unit
Electron-to-Photon Communication for Quantum Computing
Mechanoresponsive Healing Polymers
Variable Permeability Magnetometer Systems and Methods for Aerospace Applications
Evaluation Standard for Robotic Research
Small Robot Has Outstanding Vertical Agility
Smart Optical Material Characterization System and Method
Lightweight, Flexible Thermal Protection System for Fire Protection

Low-Energy-Curing Technology

A company seeks new and/or advanced curing technologies to incorporate into metal packaging protective coating and decoration processes. Systems could include metal containers for aerosols, beverages, food products, and specialty packaging, all of which require external and/or internal protective coatings. Process improvement may be obtained by upgrading the efficiency of the existing thermal curing gas ovens or introducing new coating materials that can be cured more efficiently using the existing ovens.

Posted in: NASA Tech Needs


Inhibit or Deactivate Plant Enzymes

An organization wants to deactivate or otherwise block the action of two enzymes, pectin methylesterase and polygalacturonase, which interfere with the processing of certain fruits by degrading fruit pectins. Initial processing of the whole fruit releases the enzymes, and release continues during further processing. The released enzymes decrease the thickening power of the fruit solids. By deactivating the enzymes, especially without using heat, more of the fruit solids’ thickening power may be conserved during processing. Greater retention of pectin in its natural state may also provide added benefits in serum binding and overall consumer acceptance.

Posted in: NASA Tech Needs


Open Innovation Model Helps P&G “Connect and Develop”

By Ed Getty, Research Fellow, Procter & Gamble, Cincinnati, OH

For almost 165 years of Procter & Gamble’s 170-year history, nearly all growth came from innovating within the walls of our own R&D organization. In 2000, our newly appointed CEO, A.G. Lafley, realized that P&G’s “invent it ourselves” model was not capable of sustaining high levels of top-line growth. The picture was becoming clear:

Posted in: Articles, Software


Ka-Band TWT High-Efficiency Power Combiner for High-Rate Data Transmission

Two-way combiner waveguide circuits can be concatenated for 2ⁿ-way combining. A four-port magic-T hybrid waveguide junction serves as the central component of a high-efficiency two-way power combiner circuit for transmitting a high-rate phase-modulated digital signal at a carrier frequency in the Ka-band (between 27 and 40 GHz). This power combiner was developed to satisfy a specific requirement to efficiently combine the coherent outputs of two traveling-wave-tube (TWT) amplifiers that are typically characterized by power levels on the order of 100 W or more. In this application, the use of a waveguide-based power combiner (instead of a coaxial-cable- or microstrip-based power combiner, for example) is dictated by requirements for low loss, high power-handling capability, and broadband response. Combiner efficiencies were typically 90 percent or more over both the linear and saturated output power regions of operation of the TWTs.
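
As a back-of-the-envelope illustration (not taken from the brief), the output of a binary tree of concatenated two-way combiners can be estimated from the per-stage efficiency; the function name and the 90-percent default are assumptions based on the figures quoted above:

```python
def combined_power(tube_power_w, n_stages, stage_efficiency=0.90):
    """Estimated output of a binary combining tree: 2**n_stages TWTs,
    each delivering tube_power_w watts, merged pairwise through
    n_stages levels of two-way combiners, each with the given
    efficiency. Illustrative model only."""
    n_tubes = 2 ** n_stages
    return n_tubes * tube_power_w * stage_efficiency ** n_stages

# e.g. two 100 W tubes through one 90%-efficient stage -> 180.0 W
```

The exponent on the stage efficiency shows why high per-stage efficiency matters: losses compound at every level of the tree.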

Posted in: Briefs, Electronics & Computers, Data exchange, Waveguides


Reusable, Extensible High-Level Data-Distribution Concept

Users can optimize distributions for parallel computing, without concern for tedious details. A framework for high-level specification of data distributions in data-parallel application programs has been conceived. [As used here, “distributions” signifies means to express locality (more specifically, locations of specified pieces of data) in a computing system composed of many processor and memory components connected by a network.] Because distributions strongly affect the performance of application programs, it is important that a distribution strategy be flexible, so that distributions can be adapted to the requirements of those programs. At the same time, for the sake of productivity in programming and execution, it is desirable that users be shielded from such error-prone, tedious details as those of communication and synchronization.
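
The brief does not publish an API, but the idea of hiding locality arithmetic behind a distribution object can be sketched in Python (all names here are hypothetical):

```python
class BlockDistribution:
    """Minimal sketch of a high-level distribution: maps each global
    index of an n-element array onto one of p processors in contiguous
    blocks, so user code never does the index arithmetic itself."""

    def __init__(self, n, p):
        self.n, self.p = n, p
        self.block = -(-n // p)  # ceiling division: block size per processor

    def owner(self, i):
        """Processor rank that holds global index i."""
        return i // self.block

    def local_indices(self, rank):
        """Global indices stored on processor `rank`."""
        lo = rank * self.block
        return range(lo, min(lo + self.block, self.n))
```

A cyclic or block-cyclic distribution could expose the same two methods, which is the flexibility the framework aims for: the program stays the same while the distribution strategy changes underneath it.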

Posted in: Briefs, TSP, Information Sciences, Data management


Monitoring by Use of Clusters of Sensor-Data Vectors

Incoming data vectors are compared with clustered vectors representative of normal operation. The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the “health” of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal (“healthy”) condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term “inductive” reflects the use of a process reminiscent of traditional mathematical induction to “learn” about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: The computational burden of the IMS is significantly smaller, and there is no need for abnormal-condition sensor data for training the IMS to recognize abnormal conditions.
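
A minimal sketch of the idea, assuming Euclidean distance and simple bounding-box clusters (the actual IMS clustering algorithm and its parameters are not given in the brief):

```python
import math

def build_clusters(nominal_vectors, radius):
    """'Learn' normal operation: group nominal sensor vectors into
    clusters, each summarized as per-dimension [lo, hi] bounds.
    `radius` controls how far a vector may sit from a cluster's
    centre before a new cluster is started. Illustrative only."""
    clusters = []  # each cluster: one [lo, hi] pair per dimension
    for v in nominal_vectors:
        for c in clusters:
            centre = [(lo + hi) / 2 for lo, hi in c]
            if math.dist(v, centre) <= radius:
                # grow the existing cluster's bounds to cover v
                for i, (lo, hi) in enumerate(c):
                    c[i] = [min(lo, v[i]), max(hi, v[i])]
                break
        else:
            clusters.append([[x, x] for x in v])
    return clusters

def deviation(v, clusters):
    """Distance from incoming vector v to the nearest cluster;
    0 means v falls inside some cluster of nominal behaviour."""
    best = float("inf")
    for c in clusters:
        # distance to the box: clamp each coordinate into [lo, hi]
        nearest = [min(max(x, lo), hi) for x, (lo, hi) in zip(v, c)]
        best = min(best, math.dist(v, nearest))
    return best
```

Note that only nominal data is needed to build the clusters; any incoming vector with a large deviation is flagged, which matches the brief's point that no abnormal-condition training data is required.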

Posted in: Briefs, TSP, Information Sciences, Computer software and hardware, Sensors and actuators, Vehicle health management


Processing Satellite Imagery To Detect Waste Tire Piles

Less time is needed for searching for previously unidentified piles. A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method’s development. The methodology combines commercially available image-processing and georeferencing software to develop a model that specifically distinguishes tire piles from other objects. The methodology reduces the time that must be spent to initially survey a region for tire sites, thereby increasing inspectors’ and managers’ time available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate.
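
The brief does not describe the model’s internals; one common way to distinguish a target material in multispectral imagery is spectral-angle matching, sketched here with hypothetical band values and threshold:

```python
import math

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel's band-value vector and a
    reference spectrum; small angles mean spectrally similar
    materials, regardless of overall brightness."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norms = (math.sqrt(sum(p * p for p in pixel))
             * math.sqrt(sum(r * r for r in reference)))
    return math.acos(max(-1.0, min(1.0, dot / norms)))

def flag_candidates(image, tire_spectrum, max_angle=0.1):
    """Return (row, col) of pixels whose spectrum lies within
    max_angle of the tire reference: candidate tire-pile pixels
    for an inspector to check."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, pixel in enumerate(row)
            if spectral_angle(pixel, tire_spectrum) <= max_angle]
```

Adapting such a model to a new region would amount to re-deriving the reference spectrum and threshold for the local terrain and climate, in line with the brief’s closing remark.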

Posted in: Briefs, TSP, Information Sciences, Tires and traction, Imaging and visualization, Waste materials, Satellites


The U.S. Government does not endorse any commercial product, process, or activity identified on this web site.