Compact Thermal Neutron Imaging System Using Axisymmetric Focusing Mirrors

This technology uses grazing incidence reflective optics to produce focused beams of neutrons from commercially available sources.

Marshall Space Flight Center, Alabama

NASA’s Marshall Space Flight Center has developed novel neutron grazing incidence optics for use with small-scale portable neutron generators. The technology was developed to enable the use of commercially available neutron generators for applications requiring high flux densities, including high-performance imaging and analysis. Nested grazing incidence mirror optics, with high collection efficiency, are used to produce divergent, parallel, or convergent neutron beams. Ray tracing simulations of the system (with a source-object separation of 10 m for 5 meV neutrons) show nearly an order of magnitude increase in neutron flux on a 1-mm-diameter object. The technology is a result of joint development efforts between NASA and MIT researchers seeking to maximize neutron flux from diffuse sources for imaging and testing applications.

Posted in: Briefs, Imaging, Mirrors, Imaging and visualization


High-Speed Edge-Detecting Circuit for Use with Linear Image Sensor

Applications include supersonic jets, manufacturing, lane line tracking for vehicle control, bar code scanners, and digital photography.

John H. Glenn Research Center, Cleveland, Ohio

A new smart camera developed at NASA’s Glenn Research Center has the ability to process and transmit valuable edge location data for the images that it captures — at a rate of over 900 frames per second. The camera was designed to operate as a component in an inlet shock detection system for supersonic jets. A supersonic jet cannot function properly unless the airflow entering the machine is compressed and slowed to subsonic speed in the inlet before it reaches the engine. When supersonic air is compressed, it forms shock waves that can destroy the turbofan and surrounding components unless they are pinpointed and adjusted. This smart camera uses an edge detection signal processing circuit to determine the exact location of shock waves, and sends the location information via an onboard microcontroller or external digital interface. This highly customizable camera’s ability to quickly identify precise location data makes it ideal for a variety of other applications where high-speed edge detection is needed.
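The brief does not disclose the circuit's signal-processing details, but the core operation — locating an intensity edge along a single scan line from a linear image sensor — can be sketched in software as a gradient peak search with parabolic sub-pixel refinement (a minimal illustration; the function name and threshold are hypothetical, not taken from the NASA design):

```python
def edge_location(scan, threshold=50.0):
    """Return the sub-pixel position of the strongest edge in a 1D
    scan line, or None if no gradient magnitude exceeds the threshold."""
    # First-difference gradient between adjacent pixels
    g = [scan[i + 1] - scan[i] for i in range(len(scan) - 1)]
    k = max(range(len(g)), key=lambda i: abs(g[i]))
    if abs(g[k]) < threshold:
        return None
    # Parabolic fit over the gradient peak refines to sub-pixel accuracy
    if 0 < k < len(g) - 1:
        a, b, c = abs(g[k - 1]), abs(g[k]), abs(g[k + 1])
        denom = a - 2 * b + c
        offset = 0.5 * (a - c) / denom if denom else 0.0
    else:
        offset = 0.0
    return k + 0.5 + offset  # edge lies between samples k and k+1

edge_location([10] * 5 + [200] * 5)  # → 4.5 (edge between pixels 4 and 5)
```

In the actual smart camera this computation runs in dedicated hardware on every captured line, which is what makes the 900+ frames-per-second rate achievable.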

Posted in: Briefs, Imaging, Imaging and visualization, Sensors and actuators, Hypersonic and supersonic aircraft


Two- and Three-Dimensional Near-Infrared Subcutaneous Structure Imager Using Adaptive Nonlinear Video Processing

The battery-powered system uses off-the-shelf near-infrared technology that is not affected by melanin content, and can also operate in dark environments.

John H. Glenn Research Center, Cleveland, Ohio

Scientists at NASA’s Glenn Research Center have successfully developed a novel subcutaneous structure imager for locating veins in challenging patient populations, such as juvenile, elderly, dark-skinned, or obese patients. Spurred initially by the needs of pediatric sickle-cell anemia patients in Africa, Glenn’s groundbreaking system includes a camera-processor-display apparatus and uses an innovative image-processing method to provide two- or three-dimensional, high-contrast visualization of veins or other vascular structures. In addition to helping practitioners find veins in challenging populations, this system can also help novice healthcare workers locate veins for procedures such as needle insertion or excision. Compared to other state-of-the-art solutions, the imager is inexpensive, compact, and very portable, so it can be used in remote third-world areas, emergency response situations, or military battlefields.

Posted in: Briefs, Imaging, Imaging and visualization, Cardiovascular system, Medical equipment and supplies


Methods of Real-Time Image Enhancement of Flash LIDAR Data and Navigating a Vehicle Using Flash LIDAR Data

Applications include robotic ground vehicle collision avoidance, topographical/terrain mapping, and automotive adaptive cruise control.

Langley Research Center, Hampton, Virginia

[Image: The original (left) and enhanced-resolution Flash LIDAR images.]

NASA’s Langley Research Center has developed 3D imaging technologies (Flash LIDAR) for real-time terrain mapping and synthetic vision-based navigation. To take advantage of the information inherent in a sequence of 3D images acquired at video rates, NASA Langley has also developed an embedded image-processing algorithm that can simultaneously correct, enhance, and derive relative motion by processing this image sequence into a high-resolution 3D synthetic image. Traditional scanning LIDAR techniques generate an image frame by raster scanning an image one laser pulse per pixel at a time, whereas Flash LIDAR acquires an image much like an ordinary camera, generating an image using a single laser pulse. The benefits of the Flash LIDAR technique and the corresponding image-to-image processing enable autonomous vision-based guidance and control for robotic systems. The current algorithm offers up to eight times image resolution enhancement, as well as a 6-degree-of-freedom state vector of motion in the image frame.
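The brief does not publish the enhancement algorithm, but the underlying idea — fusing a registered sequence of low-resolution range frames onto a finer grid once inter-frame motion is known — can be illustrated with a minimal "shift-and-add" sketch (the function name and nearest-neighbor binning are illustrative assumptions; the actual algorithm derives the shifts from its 6-degree-of-freedom motion estimate):

```python
import numpy as np

def shift_and_add(frames, shifts, scale=2):
    """Fuse low-res frames (each H x W) onto a grid `scale` times finer.

    frames: list of 2D arrays (e.g., range images).
    shifts: per-frame (dy, dx) sub-pixel offsets in low-res pixels,
            assumed known from motion estimation.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for f, (dy, dx) in zip(frames, shifts):
        # Map each low-res sample to its nearest high-res cell
        yi = np.clip(np.round((np.arange(h)[:, None] + dy) * scale),
                     0, h * scale - 1).astype(int)
        xi = np.clip(np.round((np.arange(w)[None, :] + dx) * scale),
                     0, w * scale - 1).astype(int)
        np.add.at(acc, (yi, xi), f)
        np.add.at(cnt, (yi, xi), 1)
    # Average overlapping samples; cells never hit remain zero
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```

Sub-pixel diversity among the shifts is what lets the fused grid resolve detail beyond the native sensor resolution.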

Posted in: Briefs, Imaging, Mathematical models, Imaging and visualization, Lidar, Navigation and guidance systems, Robotics


Spatially Aberrated Spectral Filtering for High-Performance Spectral Imaging

This innovation has application in the biomedical research, semiconductor, and analysis/characterization fields.

NASA’s Jet Propulsion Laboratory, Pasadena, California

High-performance thermal imagers like Mars Climate Sounder (MCS) on the Mars Reconnaissance Orbiter (MRO) and the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter (LRO) currently use a three-mirror anastigmat (TMA) optical design to image remote targets. A TMA telescope is built with three curved mirrors, enabling it to minimize all three main optical aberrations: spherical aberration, coma, and astigmatism. This is primarily used to enable wide fields of view, much larger than possible with telescopes with just one or two curved surfaces.

Posted in: Briefs, Imaging, Mirrors, Imaging and visualization, Spacecraft


A Common-Mode Digital Holographic Microscope

This instrument has no moving parts and allows scientists to image in 3D and in real time.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Digital holography is a fast-growing field in optics, recently spurred by the advent of large-format digital cameras and high-speed computers. The method provides a time series of volumetric information about a sample, yet the instrument itself has no moving parts and does not compromise performance measures such as image quality and spatial resolution. However, these systems are typically implemented as optical interferometers with two separate beam paths: one is the reference beam and the other is the science beam. Interferometers are sensitive instruments that are subject to misalignment, and their performance degrades significantly in the presence of mechanical vibrations.
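The brief describes the instrument only at a system level, but the numerical step common to digital holographic microscopes — refocusing a recorded complex field to any depth after the fact — is typically done with the angular-spectrum method, sketched below (wavelength, pixel pitch, and distance are illustrative values, not parameters of the JPL instrument):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, z):
    """Numerically refocus a complex optical field by distance z (meters).

    field: 2D complex array sampled at `pitch` meters per pixel.
    Angular-spectrum method: FFT, multiply by the free-space transfer
    function, inverse FFT.
    """
    n, m = field.shape
    fy = np.fft.fftfreq(n, d=pitch)[:, None]
    fx = np.fft.fftfreq(m, d=pitch)[None, :]
    arg = 1.0 / wavelength**2 - fx**2 - fy**2
    # Evanescent components (arg < 0) carry no energy to distance z
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Refocus a field by 1 cm at 632.8 nm with 3.45-um pixels
refocused = angular_spectrum_propagate(
    np.ones((64, 64), complex), 632.8e-9, 3.45e-6, 0.01)
```

Because propagation happens in software, a whole stack of focal planes — the time series of volumetric information described above — comes from a single recorded hologram per time step, with no mechanical focusing.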

Posted in: Briefs, Imaging, Microscopy


Introduction to Machine Vision

A guide to automating process and quality improvements. Get the basics of how machine vision technology works and why it's the right choice for automating process and quality improvements. The Introduction to Machine Vision whitepaper is the first step to understanding what machine vision is, what kinds of problems it solves, what components you need to build a vision system, how to get the most out of your vision system, and more. Read this whitepaper to see why automated inspection is vastly superior to manual techniques.

Posted in: White Papers, Machine Vision, Automation


Boosting Machine Vision with Built-in FPGA Image Preprocessing

Since image processing tasks can consume major CPU resources in machine vision applications, increasing processing performance within size constraints is a common challenge for solution providers. The following discusses the efficacy of FPGAs in addressing such performance shortcomings, presents the image processing tasks most suitable for FPGAs, and compares the capabilities of CPUs and FPGAs in operation. A built-in FPGA image preprocessing solution supporting machine vision applications is then presented.

Posted in: White Papers, Imaging, Optics, Photonics, Automation, Robotics


NASA’s Infrared Sensor Spots Near-Earth Asteroids

The Near-Earth Object Camera (NEOCam) is part of a proposed NASA mission to find potentially hazardous asteroids. In a Q&A with Photonics & Imaging Technology, NEOCam principal investigator Amy Mainzer explains how the NEOCam chip, a stamp-sized megapixel infrared sensor, detects the faint heat emitted by near-Earth objects circling the Sun.

Posted in: Articles, Features, Imaging, Photonics, Imaging and visualization, Sensors and actuators, Spacecraft


Advanced Digital Microscopes Providing Simple Solutions to Common Microscopy Issues

Thanks to a combination of high-quality optics and advanced digital imaging technology, today’s newest digital microscopes provide efficient solutions to a variety of common microscope challenges faced by users of conventional optical and digital microscopes. The following represent 10 conventional microscope issues and 10 solutions made possible with current digital microscope technology.

Posted in: Articles, Features, Imaging, Photonics, Microscopy

