Imaging

ORCA Prototype Ready to Observe Ocean

If selected for a NASA flight mission, the Ocean Radiometer for Carbon Assessment (ORCA) instrument will study microscopic phytoplankton, the tiny green plants that float in the upper layer of the ocean and form the base of the marine food chain.

Conceived in 2001 as the next technological step forward in observing ocean color, ORCA was developed into a prototype with funding from Goddard's Internal Research and Development program and NASA's Instrument Incubator Program (IIP). Completed in 2014, ORCA is now a contender as the primary instrument on an upcoming Earth science mission.

The ORCA prototype has a scanning telescope designed to sweep across 2,000 kilometers (1,243 miles) of ocean at a time. The instrument collects light reflected from the sea surface, which then passes through a series of mirrors, optical filters, gratings, and lenses. These components direct the light onto an array of detectors that covers the full range of wavelengths.

Instead of observing a handful of discrete bands at specific wavelengths reflected off the ocean, ORCA measures a continuous range of bands, from 350 nanometers to 900 nanometers at five-nanometer resolution. The sensor will see the entire rainbow, including the color gradations of green that fade into blue. In addition to the hyperspectral bands, the instrument has three short-wave infrared bands that measure specific wavelengths between 1,200 and 2,200 nanometers for atmospheric applications.

The NASA researchers will use ORCA to obtain more accurate measurements of chlorophyll concentrations, the size of a phytoplankton bloom, and how much carbon it holds. Detecting chlorophyll at various wavelengths will also allow the team to distinguish between types of phytoplankton. The instrument could also detect suspended sediments in coastal regions.

Source

Also: Learn about an Ultra-Low-Maintenance Portable Ocean Power Station.
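The difference between a handful of discrete bands and continuous hyperspectral sampling can be made concrete with a short sketch. The band grid below is derived from the figures quoted above (350–900 nm at 5 nm steps); the helper function name is ours, not part of any ORCA software.

```python
# Sketch: enumerate the hyperspectral band centers implied by the ORCA
# figures quoted above (350-900 nm at 5 nm resolution). A legacy
# ocean-color sensor, by contrast, observes only a handful of bands.

def band_centers(start_nm, stop_nm, step_nm):
    """Return band-center wavelengths from start to stop, inclusive."""
    return list(range(start_nm, stop_nm + 1, step_nm))

hyperspectral = band_centers(350, 900, 5)
print(len(hyperspectral))  # 111 bands across the visible and near-infrared
```

At five-nanometer resolution, the 350–900 nm range yields 111 contiguous bands, which is what lets the sensor resolve the gradations of green fading into blue.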

Posted in: News, Optics, Photonics, Sensors, Measuring Instruments


Algorithm for Estimating PRC Wavefront Errors from Shack-Hartmann Camera Images

Phase retrieval is used for the calibration and fine-alignment of an optical system.

NASA's Jet Propulsion Laboratory, Pasadena, California

Phase retrieval (PR) and the Shack-Hartmann sensor (SHS) are the two preferred methods of image-based wavefront sensing, widely used in various optical testbeds, adaptive optics systems, and ground- and space-based telescopes. They recover the phase information of an optical system from defocused point-source images (PR) or from focused point-source or extended-scene images (SHS). For example, the Terrestrial Planet Finder Coronagraph's (TPF-C's) High-Contrast Imaging Testbed (HCIT) uses a phase retrieval camera (PRC) to estimate, and subsequently correct, the phase error at the exit pupil of the optical system. Several other testbeds at JPL were, and will be, equipped with both a PRC and a Shack-Hartmann camera (SHC).
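The Shack-Hartmann measurement step mentioned above can be sketched briefly: each lenslet focuses light to a spot, and the spot's displacement from its reference position, divided by the lenslet focal length, gives the local wavefront slope. All parameter values below are illustrative assumptions, not figures from the JPL testbeds.

```python
# Sketch: Shack-Hartmann wavefront slope estimation. Each lenslet's focal
# spot shifts in proportion to the local tilt of the incoming wavefront;
# displacement / focal_length recovers that slope. Values are illustrative.

def local_slopes(spot_xy, reference_xy, focal_length_m, pixel_pitch_m):
    """Convert spot centroid displacements (pixels) to wavefront slopes (rad)."""
    slopes = []
    for (sx, sy), (rx, ry) in zip(spot_xy, reference_xy):
        dx = (sx - rx) * pixel_pitch_m  # displacement on the detector, meters
        dy = (sy - ry) * pixel_pitch_m
        slopes.append((dx / focal_length_m, dy / focal_length_m))
    return slopes

# One lenslet: spot shifted 2 pixels in x on a 5-um-pitch detector,
# behind a 5-mm focal-length lenslet.
print(local_slopes([(102.0, 50.0)], [(100.0, 50.0)], 5e-3, 5e-6))
```

A full sensor repeats this over the whole lenslet array and then reconstructs the wavefront by integrating the slope field; phase retrieval instead infers the phase iteratively from defocused images.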

Posted in: Briefs, TSP, Cameras, Optics, Sensors


Video Wall Monitors Space Station Science

Clarity™ Matrix LCD Video Wall System
Planar Systems
Beaverton, OR
866-475-2627
www.planar.com

A Clarity Matrix video wall system was installed at NASA's Payload Operations Integration Center (POIC) at Marshall Space Flight Center in Alabama to monitor and manage science being conducted on the International Space Station (ISS). The POIC has been operational since 2001 and, during that time, flight control and other center personnel have monitored and managed ISS mission progress using a mix of large-scale computer monitors and a complement of large-scale projection screens to view ISS activities and share information. In the newly renovated POIC, a video wall of 24 displays has been installed in front of and above the flight control positions. Operational since mid-2013, the video wall provides capabilities that enhance collaboration among the ground team and enable them to more efficiently help the ISS crew and researchers around the world to perform science on station.

Posted in: Application Briefs, Cameras, Displays/Monitors/HMIs, Video, Data Acquisition


New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location.

Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars currently in development, including Google's vehicle. Those navigation systems use three-dimensional laser scanning technology to create a real-time map of their environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they are able to determine the vehicle's location within a few centimeters.

Wolcott's software instead converts the stored map data into a three-dimensional picture much like a video game. The car's navigation system can then compare these synthetic pictures with the real-world pictures streaming in from a conventional video camera.

Source

Also: See more Software tech briefs.
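The matching step described above — render a synthetic view for each candidate pose, compare it to the live camera frame, keep the best match — can be sketched in a few lines. The renderer and similarity metric below are simplified stand-ins for illustration, not the U-M system's actual algorithms.

```python
# Sketch of pose scoring by image comparison: the pose whose synthetic
# rendering best matches the camera frame is taken as the vehicle's
# location. Images here are toy grayscale grids (lists of rows).

def similarity(a, b):
    """Negative sum of absolute pixel differences (higher is better)."""
    return -sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def localize(camera_frame, candidate_poses, render):
    """Return the candidate pose whose synthetic view best matches the frame."""
    return max(candidate_poses, key=lambda p: similarity(camera_frame, render(p)))

# Toy "map": the rendered view brightens one column in proportion to pose x.
def toy_render(pose_x):
    return [[10 * pose_x if col == 2 else 0 for col in range(4)] for _ in range(3)]

frame = toy_render(3)  # pretend the camera sees the view from x = 3
print(localize(frame, [1, 2, 3, 4], toy_render))  # -> 3
```

A real system evaluates thousands of such pose hypotheses per second against renderings of the pre-drawn map, which is how centimeter-level localization emerges from a single cheap camera.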

Posted in: News, Cameras, Lasers & Laser Systems, Photonics


New Serenity Payload Detects Hostile Fire

Two government-developed sensors are working together to increase the security of deployed soldiers. The Firefly and Serenity sensors employ government-developed algorithms, software, and hardware to locate hostile fire around a base. The technology, a joint effort between the Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Army Research Laboratory (ARL), has been under development for more than a decade.

Posted in: News, Cameras, Optics, Photonics, Detectors, Sensors


Tiny Camera Lets NASA Inspection Tool “See”

micro ScoutCam™ 1.2 micro camera
Medigus, Ltd.
Omer, Israel
011 972 8646 6880
www.medigus.com

NASA has incorporated the micro ScoutCam 1.2 into its Visual Inspection Poseable Invertebrate Robot (VIPIR) tool. VIPIR is a robotic, maneuverable, borescope inspection tool being tested as part of the Robotic Refueling Mission, an experiment on the International Space Station that has been demonstrating tools, technologies, and techniques for on-orbit satellite servicing since 2011.

Posted in: Application Briefs, Articles, Cameras, Robotics


Head-Worn Display Concepts for Ground Operations for Commercial Aircraft

This display enables a higher level of safety during ground operations, including taxiway navigation and situational awareness.

Langley Research Center, Hampton, Virginia

The Integrated Intelligent Flight Deck (IIFD) project, part of NASA's Aviation Safety Program (AvSP), comprises a multi-disciplinary research effort to develop flight deck technologies that mitigate operator-, automation-, and environment-induced hazards. Toward this objective, the IIFD project is developing crew/vehicle interface technologies that reduce the propensity for pilot error, minimize the risks associated with pilot error, and proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the Next Generation Air Transportation System (NextGen). Part of this research effort involves the use of synthetic and enhanced vision systems and advanced display media as enabling crew-vehicle interface technologies to meet these safety challenges.

Posted in: Articles, Briefs, TSP, Aviation, Displays/Monitors/HMIs


Flight Imagery Recorder Locator (FIRLo) and High-Temperature Radome

This technology is applicable to the commercial airline industry for locating "black boxes."

NASA's Jet Propulsion Laboratory, Pasadena, California

The Low-Density Supersonic Decelerator (LDSD) is a Mars entry, descent, and landing (EDL) technology development project that launches three test vehicles out of the Pacific Missile Range Facility in Kauai. Most mission science data from the test vehicle can be telemetered and recorded safely on land; however, high-speed and high-resolution imagery cannot be telemetered due to bandwidth constraints. Therefore, this imagery had to be recorded solely onboard the test vehicle, in a unit called the flight imagery recorder (FIR). A typical commercial airliner "black box" is capable of recording only on the order of gigabytes of data, whereas this work required on the order of terabytes (a few orders of magnitude more).
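The gigabyte-versus-terabyte gap follows directly from the data rates of uncompressed high-speed, high-resolution imagery. The frame parameters below are illustrative assumptions for a back-of-envelope check, not figures from the LDSD project.

```python
# Back-of-envelope: why high-speed, high-resolution imagery needs a
# terabyte-class recorder. All camera parameters here are assumed for
# illustration only.

def recording_bytes(width, height, bytes_per_pixel, fps, seconds):
    """Uncompressed storage needed for a video recording, in bytes."""
    return width * height * bytes_per_pixel * fps * seconds

# A hypothetical 2048x2048, 16-bit, 500 fps camera recording for 5 minutes:
total = recording_bytes(2048, 2048, 2, 500, 300)
print(total / 1e12)  # ~1.26 terabytes for a single camera
```

Even one such camera overwhelms a gigabyte-class flight recorder in seconds, which is why the FIR had to be sized a few orders of magnitude larger than an airliner black box.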

Posted in: Articles, Briefs


Product of the Month: LED Light Engines for Large FOV Fluorescence Imaging Systems

Innovations in Optics, Inc. (Woburn, MA) offers high-power LED Light Engines as excitation illuminators for large-field-of-view fluorescence imagers used in life science instruments. LumiBright LE Light Engines feature patented non-imaging optics that direct LED light into a desired cone angle while producing highly uniform output, both angularly and spatially. The two standard far-field half-angles are 20 and 40 degrees. Available peak LED wavelengths range from 365 nm in the ultraviolet through 970 nm in the near-infrared.

Posted in: Products, LEDs, Photonics


CCD Image Sensor

The KAI-08051 charge-coupled device (CCD) image sensor from ON Semiconductor (Phoenix, AZ) shares the same advanced 5.5-micron pixel architecture, 8-megapixel resolution, 15-frame-per-second readout rate, and 4/3 optical format as the existing KAI-08050 image sensor, but improves key performance parameters through an improved amplifier design, a newly optimized microlens structure, and new color filter pigments in both Bayer and Sparse color configurations.
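The quoted resolution and frame rate imply the sensor's readout throughput, which is the parameter the amplifier design must support. The arithmetic below uses only the nominal figures from the text.

```python
# Readout throughput implied by the figures quoted above: a nominal
# 8-megapixel sensor read out at 15 frames per second.

megapixels = 8
frames_per_second = 15
throughput_mpix_per_s = megapixels * frames_per_second
print(throughput_mpix_per_s)  # 120 megapixels per second
```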

Posted in: Products, Photonics, Sensors
