Imaging

Viewpoints Software for Visualization of Multivariate Data

Ames Research Center, Moffett Field, California

Viewpoints software allows interactive visualization of multivariate data using a variety of standard techniques. The software is built exclusively from high-performance, cross-platform, open-source, standards-compliant languages, libraries, and components. The techniques included are:
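As an illustration only, the sketch below shows one standard technique for interactive multivariate visualization, a scatterplot matrix, using pandas and matplotlib; it is not part of the Viewpoints software, and the variable names are invented placeholders.

```python
# Illustrative sketch of a scatterplot matrix, one standard
# multivariate-visualization technique. Not part of Viewpoints.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import scatter_matrix

rng = np.random.default_rng(0)
# Hypothetical four-variable data set; the column names are placeholders.
data = pd.DataFrame({
    "flux": rng.normal(size=500),
    "temperature": rng.normal(size=500),
    "radius": rng.normal(size=500),
    "distance": rng.normal(size=500),
})

scatter_matrix(data, alpha=0.4, figsize=(6, 6), diagonal="hist")
plt.show()
```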

Posted in: Briefs, Data Acquisition, Visualization Software, Electronics & Computers, Mathematical/Scientific Software, Software

ORCA Prototype Ready to Observe Ocean

If selected for a NASA flight mission, the Ocean Radiometer for Carbon Assessment (ORCA) instrument will study microscopic phytoplankton, the tiny green plants that float in the upper layer of the ocean and make up the base of the marine food chain. Conceived in 2001 as the next technological step forward in observing ocean color, ORCA was developed into a prototype by a team that used funding from Goddard’s Internal Research and Development program and NASA’s Instrument Incubator Program (IIP). Completed in 2014, ORCA is now a contender as the primary instrument on an upcoming Earth science mission.

The ORCA prototype has a scanning telescope designed to sweep across 2,000 kilometers (1,243 miles) of ocean at a time. The instrument collects light reflected from the sea surface, which then passes through a series of mirrors, optical filters, gratings, and lenses. These components direct the light onto an array of detectors that cover the full range of wavelengths.

Instead of observing a handful of discrete bands at specific wavelengths reflected off the ocean, ORCA measures a continuous range of bands from 350 nanometers to 900 nanometers at five-nanometer resolution. The sensor will see the entire rainbow, including the color gradations of green that fade into blue. In addition to the hyperspectral bands, the instrument has three short-wave infrared bands that measure specific wavelengths between 1,200 and 2,200 nanometers for atmospheric applications.

NASA researchers will use ORCA to obtain more accurate measurements of chlorophyll concentrations, the size of a phytoplankton bloom, and how much carbon it holds. Detecting chlorophyll in various wavelengths will also allow the team to distinguish between types of phytoplankton. The instrument could also detect suspended sediments in coastal regions.

Also: Learn about an Ultra-Low-Maintenance Portable Ocean Power Station.
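As a rough illustration of how such hyperspectral bands can be used, the sketch below builds the 350-900 nm, 5 nm band grid described above and applies a simple blue/green band-ratio chlorophyll index; the ratio form and coefficients are placeholders and do not represent ORCA's actual retrieval algorithm.

```python
# Illustrative band-ratio chlorophyll index on a hyperspectral band grid.
# The coefficients are placeholders, not ORCA's retrieval algorithm.
import numpy as np

# Hyperspectral band centers described for ORCA: 350-900 nm at 5 nm steps.
bands_nm = np.arange(350, 905, 5)

def band(reflectance, wavelength_nm):
    """Return the reflectance value at the band nearest wavelength_nm."""
    return reflectance[np.argmin(np.abs(bands_nm - wavelength_nm))]

def chlorophyll_index(reflectance, a=0.3, b=-2.5):
    """Toy power-law blue/green band-ratio index (placeholder a, b)."""
    ratio = band(reflectance, 490) / band(reflectance, 555)
    return a * ratio ** b

# Synthetic reflectance spectrum, only to exercise the functions.
rng = np.random.default_rng(1)
spectrum = 0.02 + 0.01 * rng.random(bands_nm.size)
print(f"{bands_nm.size} hyperspectral bands; toy index = "
      f"{chlorophyll_index(spectrum):.3f}")
```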

Posted in: News, Imaging, Optics, Photonics, Sensors, Measuring Instruments, Test & Measurement

Video Wall Monitors Space Station Science

Clarity™ Matrix LCD Video Wall System
Planar Systems
Beaverton, OR
866-475-2627
www.planar.com

A Clarity Matrix video wall system was installed at NASA’s Payload Operations Integration Center (POIC) at Marshall Space Flight Center in Alabama to monitor and manage science being conducted on the International Space Station (ISS). The POIC has been operational since 2001 and, during that time, flight control and other center personnel have monitored and managed ISS mission progress using a mix of large-scale computer monitors and a complement of large-scale projection screens to view ISS activities and share information. In the newly renovated POIC, a video wall of 24 displays has been installed in front of and above the flight control positions. Operational since mid-2013, the video wall provides capabilities that enhance collaboration among the ground team and enable them to more efficiently help the ISS crew and researchers around the world to perform science on station.

Posted in: Application Briefs, Cameras, Displays/Monitors/HMIs, Imaging, Video, Data Acquisition

Algorithm for Estimating PRC Wavefront Errors from Shack-Hartmann Camera Images

Phase retrieval is used for the calibration and fine alignment of an optical system.

NASA’s Jet Propulsion Laboratory, Pasadena, California

Phase retrieval (PR) and the Shack-Hartmann sensor (SHS) are the two preferred methods of image-based wavefront sensing, widely used in various optical testbeds, adaptive optics systems, and ground- and space-based telescopes. They recover the phase information of an optical system from defocused point-source images (PR) or from focused point-source or extended-scene images (SHS). For example, the Terrestrial Planet Finder Coronagraph’s (TPF-C’s) High-Contrast Imaging Testbed (HCIT) uses a PR camera (PRC) to estimate, and subsequently correct, the phase error at the exit pupil of the optical system. Several other testbeds at JPL were, and will be, equipped with both a PRC and a Shack-Hartmann camera (SHC).
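Because the brief does not describe the PRC estimation algorithm itself, the sketch below shows only the general idea of image-based phase retrieval: a minimal Gerchberg-Saxton iteration that recovers pupil phase from a known pupil amplitude and a point-source image amplitude. It is an assumption-laden illustration, not the HCIT or SHC algorithm, and it does not handle PR's sign and convergence ambiguities.

```python
# Minimal Gerchberg-Saxton-style phase retrieval: recover pupil phase
# from the pupil amplitude and a focal-plane (point-source image)
# amplitude. Illustrative only; not the PRC algorithm used on the HCIT.
import numpy as np

def gerchberg_saxton(pupil_amp, focal_amp, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    field = pupil_amp * np.exp(1j * rng.uniform(-np.pi, np.pi, pupil_amp.shape))
    for _ in range(n_iter):
        focal = np.fft.fft2(field)
        # Impose the measured focal-plane amplitude, keep the phase.
        focal = focal_amp * np.exp(1j * np.angle(focal))
        field = np.fft.ifft2(focal)
        # Impose the known pupil amplitude, keep the phase.
        field = pupil_amp * np.exp(1j * np.angle(field))
    return np.angle(field)

# Synthetic test: circular pupil with a small toy aberration.
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x**2 + y**2 <= 1.0).astype(float)
true_phase = 0.5 * (x**2 - y**2) * pupil  # toy astigmatism
focal_amp = np.abs(np.fft.fft2(pupil * np.exp(1j * true_phase)))

recovered = gerchberg_saxton(pupil, focal_amp)
# Convergence and the twin-image ambiguity are not handled here.
print("residual std inside pupil:",
      float(np.std((recovered - true_phase)[pupil > 0])), "rad")
```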

Posted in: Briefs, TSP, Cameras, Imaging, Optics, Sensors

New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location.

Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars currently in development, including Google's vehicle. Those navigation systems use three-dimensional laser scanning technology to create a real-time map of their environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they are able to determine the vehicle's location within a few centimeters.

Wolcott's software instead converts the stored map data into a three-dimensional picture much like a video game. The car's navigation system can then compare these synthetic pictures with the real-world pictures streaming in from a conventional video camera.

Also: See more Software tech briefs.
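The matching step described above can be sketched as follows: render synthetic views of the stored 3-D map at candidate poses and keep the pose whose view best correlates with the live camera frame. Everything below, including the stub renderer, is a hypothetical illustration rather than the actual U-M implementation.

```python
# Conceptual localization-by-image-matching sketch. The "renderer" is a
# stand-in for a game-engine render of the prior 3-D map; this is not
# the University of Michigan system's actual implementation.
import numpy as np

def normalized_cross_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def localize(camera_frame, candidate_poses, render_from_map):
    """Return the candidate pose whose synthetic view best matches the frame."""
    scores = [normalized_cross_correlation(camera_frame, render_from_map(p))
              for p in candidate_poses]
    return candidate_poses[int(np.argmax(scores))]

# Toy usage with a fake renderer: views far from pose (0, 0) get noisier.
rng = np.random.default_rng(2)
true_view = rng.random((48, 64))

def fake_render(pose):
    noise = 0.8 * (abs(pose[0]) + abs(pose[1]))
    return true_view + noise * rng.random(true_view.shape)

poses = [(dx * 0.5, dy * 0.5) for dx in range(-2, 3) for dy in range(-2, 3)]
print("best pose:", localize(true_view, poses, fake_render))
```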

Posted in: News, Automotive, Cameras, Imaging, Lasers & Laser Systems, Photonics, Software

New Serenity Payload Detects Hostile Fire

Two government-developed sensors are working together to increase the security of deployed soldiers. The Firefly and Serenity sensors employ government-developed algorithms, software, and hardware to locate hostile fire around a base. The technology, a joint effort between the Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Army Research Laboratory (ARL), has been under development for more than a decade.

Posted in: News, Defense, Electronics & Computers, Cameras, Imaging, Optics, Photonics, Detectors, Sensors

Product of the Month: LED Light Engines for Large FOV Fluorescence Imaging Systems

Innovations in Optics, Inc. (Woburn, MA) offers high-power LED light engines as excitation illuminators for large field-of-view fluorescence imagers used in life science instruments. LumiBright LE Light Engines feature patented non-imaging optics that direct LED light into a desired cone angle while producing highly uniform output, both angularly and spatially. The two standard far-field half-angles are 20 and 40 degrees. Available peak LED wavelengths range from 365 nm in the ultraviolet through 970 nm in the near-infrared.

Posted in: Products, Imaging, LEDs, Photonics

CCD Image Sensor

The KAI-08051 charge-coupled device (CCD) image sensor from ON Semiconductor (Phoenix, AZ) shares the same advanced 5.5-micron pixel architecture, 8-megapixel resolution, 15-frame-per-second readout rate, and 4/3 optical format as the existing KAI-08050 image sensor, but improves key performance parameters through an improved amplifier design, a newly optimized microlens structure, and new color filter pigments in both Bayer and Sparse color configurations.
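As a quick back-of-the-envelope check on those figures, the snippet below computes the pixel throughput implied by 8 megapixels read out at 15 frames per second; the bit depth used for the data-rate estimate is an assumption, not a KAI-08051 specification.

```python
# Back-of-the-envelope throughput from the brief's figures.
pixels = 8_000_000           # ~8 megapixels
frames_per_second = 15
bits_per_pixel = 14          # assumed sample depth, not from the brief

pixel_rate = pixels * frames_per_second             # pixels per second
data_rate_mbps = pixel_rate * bits_per_pixel / 1e6  # megabits per second
print(f"{pixel_rate / 1e6:.0f} Mpix/s, ~{data_rate_mbps:.0f} Mbit/s "
      f"(assuming {bits_per_pixel}-bit samples)")
```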

Posted in: Products, Imaging, Photonics, Sensors

Thermal Imaging Core

FLIR Systems, Inc. (Portland, OR) has announced its latest thermal imaging core, Muon™, designed specifically for OEMs capable of integrating uncooled focal plane arrays (FPAs) into their own camera solutions. Muon is based on FLIR’s 17-μm-pitch Vanadium Oxide (VOx) 640 × 512 or 336 × 256 FPAs and offers frame rates from 9 Hz up to 60 Hz. Optimized for size, weight, and power (SWaP), Muon has a form factor of 22 mm × 22 mm × 6 mm, a mass of less than 5 grams, and, depending on the configuration, uses less than 300 mW of power.

Posted in: Products, Imaging, Photonics

Imaging Sensor Targets

Headwall (Fitchburg, MA) has announced the availability of a new hyperspectral imager targeting very high-resolution spectral measurements of 0.1 nm over specific spectral ranges that yield indicators of vegetative fluorescence, used to measure plant health. The new sensor is based on Headwall's all-reflective concentric optical design, which uses very precise, high-diffraction-efficiency gratings for simultaneous high spatial resolution and spectral resolution of <0.1 nm across the spectral range of the instrument.

Posted in: Products, Imaging, Photonics, Sensors
