Algorithm for Estimating PRC Wavefront Errors from Shack-Hartmann Camera Images

Phase retrieval is used for the calibration and fine alignment of an optical system.

NASA's Jet Propulsion Laboratory, Pasadena, California

Phase retrieval (PR) and the Shack-Hartmann sensor (SHS) are the two preferred methods of image-based wavefront sensing, widely used in optical testbeds, adaptive optics systems, and ground- and space-based telescopes. They recover the phase information of an optical system from defocused point-source images (PR) or from focused point-source or extended-scene images (SHS). For example, the Terrestrial Planet Finder Coronagraph's (TPF-C's) High-Contrast Imaging Testbed (HCIT) uses a PR camera (PRC) to estimate, and subsequently correct, the phase error at the exit pupil of the optical system. Several other testbeds at JPL were, and will be, equipped with both a PRC and a Shack-Hartmann camera (SHC).
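The SHS half of this approach measures the local slopes of the wavefront (from lenslet spot displacements) and then integrates them back into a phase map. As a minimal illustrative sketch — not the testbed's actual algorithm — the classic least-squares zonal reconstruction can be written as follows, where `sx` and `sy` are hypothetical slope grids and `d` is the lenslet pitch:

```python
import numpy as np

def reconstruct_wavefront(sx, sy, d=1.0):
    """Least-squares zonal reconstruction of a wavefront from
    Shack-Hartmann slope measurements (illustrative sketch).

    sx, sy : (n, n) arrays of x- and y-slopes at each lenslet
    d      : lenslet pitch, in the same units as the wavefront
    """
    n = sx.shape[0]
    N = n * n
    rows, cols, vals, rhs = [], [], [], []
    eq = 0
    # Forward-difference equations in x: (W[i, j+1] - W[i, j]) / d = sx[i, j]
    for i in range(n):
        for j in range(n - 1):
            rows += [eq, eq]
            cols += [i * n + j + 1, i * n + j]
            vals += [1.0, -1.0]
            rhs.append(sx[i, j] * d)
            eq += 1
    # Forward-difference equations in y: (W[i+1, j] - W[i, j]) / d = sy[i, j]
    for i in range(n - 1):
        for j in range(n):
            rows += [eq, eq]
            cols += [(i + 1) * n + j, i * n + j]
            vals += [1.0, -1.0]
            rhs.append(sy[i, j] * d)
            eq += 1
    A = np.zeros((eq, N))
    A[rows, cols] = vals
    W, *_ = np.linalg.lstsq(A, np.array(rhs), rcond=None)
    W = W.reshape(n, n)
    return W - W.mean()  # piston is unobservable from slopes; remove it
```

Because only phase differences are measured, the overall piston term is unrecoverable, so the sketch returns a zero-mean wavefront; real reconstructors additionally weight or filter the slopes and handle pupil masks.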

Posted in: Briefs, TSP, Cameras, Optics, Sensors


Photo-Thermo-Refractive Glass Co-Doped with Luminescent Agents for All-Solid-State Microchip Lasers

Goddard Space Flight Center, Greenbelt, Maryland

A proposed solid-state technology possesses photosensitivity that enables volume hologram recording, together with high luminescence efficiency that enables stimulated emission. These features were used to record volume Bragg gratings and to demonstrate lasing under laser-diode pumping in the same volume of glass. Moreover, a combination of dopants provides extremely wide luminescence bands, enabling both wideband optical processing and extremely short laser pulse generation. Importantly, the whole design is incorporated in a single, monolithic piece of glass, which eliminates the possibility of misalignment and reduces sensitivity to vibrations. If developed, such compact, reliable laser devices would find wide use in space and aeronautical applications.

Posted in: Briefs, TSP, Lasers & Laser Systems, Optics


Large Computer-Generated Hologram with Software-Generated Calibration Wavefront Map

This type of testing of aspheric surfaces provides better imaging, lower mapping distortion, and much higher-quality substrates.

Marshall Space Flight Center, Alabama

This technology enables accurate calibration of a large computer-generated hologram (CGH) fabricated without great accuracy, such that the CGH still measures an aspheric surface to an excellent accuracy of a couple of nanometers rms. The goals are the creation of software for generating a calibration map, and the fabrication of a couple of 9-in. (≈22.9-cm)-diameter CGHs to experimentally verify the technology. Use of CGHs in testing aspheric surfaces provides many advantages, such as better imaging, lower mapping distortion, and much higher-quality substrates.

Posted in: Briefs, Optics, Electronics & Computers


Google Glass for Industrial Automation

A new concept uses Google Glass for operating machinery, bringing the benefits of wearable computing to an industrial environment. With Google's Web-enabled glasses, status and dialog messages can be projected via a head-up display directly into a person's field of vision. Online information retrieval and communication are also possible with this innovative device, and error messages can be acknowledged using a touchpad.

Posted in: Articles, Optics, Machinery & Automation


New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location.

Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars currently in development, including Google's vehicle. Those systems use three-dimensional laser scanning to create a real-time map of the environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they can determine the vehicle's location to within a few centimeters. Wolcott's software instead converts the stored map data into a three-dimensional picture, much like a video game; the car's navigation system then compares these synthetic pictures with the real-world pictures streaming in from a conventional video camera.
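The render-and-match idea can be illustrated with a toy pose search: synthetic views rendered from the stored map (here stand-in arrays) are scored against the live camera frame, and the best-scoring pose is taken as the vehicle's location. The scoring metric (zero-mean normalized cross-correlation) and all names here are illustrative assumptions, not the actual U-M system:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation between two equal-size images.
    Returns a value in [-1, 1]; 1 means a perfect (scaled) match."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def localize(camera_frame, candidates):
    """Pick the candidate pose whose synthetic rendering best matches the
    live camera frame. `candidates` is a list of (pose, synthetic_image)
    pairs, e.g. poses rendered from a prior 3-D map around a GPS guess."""
    best_pose, _ = max(candidates, key=lambda pc: ncc(camera_frame, pc[1]))
    return best_pose
```

A production system would render and score thousands of candidate poses per second and fuse the result with odometry; the sketch only shows the core comparison step.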

Posted in: News, Cameras, Lasers & Laser Systems, Photonics


Technology Diagnoses Brain Damage from Concussions, Strokes, and Dementia

New optical diagnostic technology developed at Tufts University School of Engineering promises new ways to identify and monitor brain damage resulting from traumatic injury, stroke, or vascular dementia in real time and without invasive procedures.

Posted in: News, Electronic Components, Diagnostics, Fiber Optics, Optics, Photonics, Measuring Instruments


New Serenity Payload Detects Hostile Fire

Two government-developed sensors are working together to increase the security of deployed soldiers. The Firefly and Serenity sensors employ government-developed algorithms, software, and hardware to locate hostile fire around a base. The technology, a joint effort between the Army Aviation Research, Development and Engineering Center (AMRDEC) and the Army Research Laboratory (ARL), has been under development for more than a decade.

Posted in: News, Cameras, Optics, Photonics, Detectors, Sensors