The Hubble Space Telescope

25 Years of Challenges and Triumphs

By Bruce A. Bennett

On April 24, 1990, something happened that forever altered mankind's view of the universe. It was on that day that the Hubble Space Telescope (HST) was launched into space aboard the Space Shuttle Discovery.


Interview with Jim Odom, Hubble Project Manager 1983-1986

Jim Odom served as project manager on the Hubble Space Telescope from 1983 to 1986. As the world celebrates Hubble’s 25th anniversary, Mr. Odom shared some of his experiences on the project.


Hubble Spinoffs: Space Age Technology for the Masses

By Bruce A. Bennett

Over the past 25 years, some of the sophisticated technology developed for the HST has been successfully spun off and commercialized to improve life on Earth.


3D Volumetric Display Technology

The United States government spends a lot of money on its defense programs, investing in the training and technology necessary to arm and prepare the most advanced fighting force on the planet. The price tag for these efforts reached $581 billion in 2014, as the various branches of the armed forces continued to dedicate funds to the research and development of innovative tools and technology.


How Do You Assess Image Quality?

What does "image quality" mean for you? What exactly differentiates a "good" from a "bad" image? How can image quality be measured in an industrial camera and which criteria are used to assess it?


ORCA Prototype Ready to Observe Ocean

If selected for a NASA flight mission, the Ocean Radiometer for Carbon Assessment (ORCA) instrument will study microscopic phytoplankton, the tiny green plants that float in the upper layer of the ocean and make up the base of the marine food chain. Conceived in 2001 as the next technological step forward in observing ocean color, ORCA was developed into a prototype by a team using funding from Goddard's Internal Research and Development program and NASA's Instrument Incubator Program (IIP). Completed in 2014, ORCA is now a contender to be the primary instrument on an upcoming Earth science mission.

The ORCA prototype has a scanning telescope designed to sweep across 2,000 kilometers (1,243 miles) of ocean at a time. The instrument collects light reflected from the sea surface, which then passes through a series of mirrors, optical filters, gratings, and lenses. These components direct the light onto an array of detectors that cover the full range of wavelengths.

Instead of observing a handful of discrete bands at specific wavelengths reflected off the ocean, ORCA measures a continuous range of bands, from 350 nanometers to 900 nanometers at five-nanometer resolution (a simple sketch of this band grid follows below). The sensor will see the entire rainbow, including the color gradations of green that fade into blue. In addition to the hyperspectral bands, the instrument has three short-wave infrared bands that measure specific wavelengths between 1200 and 2200 nanometers for atmospheric applications.

The NASA researchers will use ORCA to obtain more accurate measurements of chlorophyll concentrations, the size of a phytoplankton bloom, and how much carbon it holds. Detecting chlorophyll at various wavelengths will also allow the team to distinguish between types of phytoplankton. The instrument could also detect suspended sediments in coastal regions.

Source

Also: Learn about an Ultra-Low-Maintenance Portable Ocean Power Station.
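As a back-of-the-envelope illustration of the sampling described above (not ORCA's actual calibration table), the short Python sketch below builds the hyperspectral band grid from the numbers in the article and counts the bands. The SWIR center wavelengths are placeholders, since the article gives only the 1200-2200 nanometer range.

```python
# Illustrative sketch of ORCA's spectral sampling as described in the
# article above; not the instrument's actual calibration table.

HYPERSPECTRAL_START_NM = 350   # shortest hyperspectral wavelength
HYPERSPECTRAL_END_NM = 900     # longest hyperspectral wavelength
RESOLUTION_NM = 5              # spectral sampling interval

# Band centers every 5 nm from 350 to 900 nm, inclusive: 111 bands.
band_centers = list(range(HYPERSPECTRAL_START_NM,
                          HYPERSPECTRAL_END_NM + 1,
                          RESOLUTION_NM))

# Three SWIR bands lie between 1200 and 2200 nm; the article does not
# give their exact centers, so these values are hypothetical.
swir_centers = [1240, 1640, 2130]  # placeholders, for illustration only

print(f"{len(band_centers)} hyperspectral bands, "
      f"{band_centers[0]}-{band_centers[-1]} nm at {RESOLUTION_NM} nm steps")
print(f"{len(swir_centers)} SWIR bands (placeholder centers): {swir_centers}")
```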


Algorithm for Estimating PRC Wavefront Errors from Shack-Hartmann Camera Images

Phase retrieval is used for the calibration and fine-alignment of an optical system.

NASA's Jet Propulsion Laboratory, Pasadena, California

Phase retrieval (PR) and the Shack-Hartmann sensor (SHS) are the two preferred methods of image-based wavefront sensing, widely used in optical testbeds, adaptive optics systems, and ground- and space-based telescopes. They recover the phase information of an optical system from defocused point-source images (PR) or from focused point-source or extended-scene images (SHS). For example, the Terrestrial Planet Finder Coronagraph's (TPF-C's) High-Contrast Imaging Testbed (HCIT) uses a PR camera (PRC) to estimate, and subsequently correct, the phase error at the exit pupil of the optical system. Several other testbeds at JPL were, and will be, equipped with both a PRC and a Shack-Hartmann camera (SHC).
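For readers unfamiliar with PR, a minimal Gerchberg-Saxton-style loop conveys the basic idea: alternate between the pupil and focal planes, enforcing the known amplitude in each plane while keeping the evolving phase estimate. The Python/NumPy sketch below is a textbook illustration of image-based phase retrieval in general, not the PRC estimation algorithm described in this brief.

```python
import numpy as np

def gerchberg_saxton(pupil_amp, focal_amp, n_iter=200, seed=0):
    """Minimal Gerchberg-Saxton phase retrieval sketch.

    pupil_amp : known pupil-plane amplitude (e.g., the aperture mask)
    focal_amp : measured focal-plane amplitude (sqrt of the PSF image)
    Returns the estimated pupil-plane phase in radians.

    A classic alternating-projection illustration, not JPL's estimator.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, pupil_amp.shape)  # random start
    for _ in range(n_iter):
        pupil = pupil_amp * np.exp(1j * phase)            # enforce pupil amplitude
        field = np.fft.fft2(pupil)                        # propagate to focal plane
        field = focal_amp * np.exp(1j * np.angle(field))  # enforce measured amplitude
        pupil = np.fft.ifft2(field)                       # propagate back to pupil
        phase = np.angle(pupil)                           # keep phase estimate
    return phase
```

Real PR estimators, including those used with a PRC, typically work from several defocused images and a model of the optics; the single-image loop above conveys only the alternating-projection idea.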


Video Wall Monitors Space Station Science

Clarity™ Matrix LCD Video Wall System
Planar Systems
Beaverton, OR
866-475-2627

A Clarity Matrix video wall system was installed at NASA's Payload Operations Integration Center (POIC) at Marshall Space Flight Center in Alabama to monitor and manage science being conducted on the International Space Station (ISS). The POIC has been operational since 2001, and during that time flight control and other center personnel have monitored and managed ISS mission progress using a mix of large-scale computer monitors and a complement of large-scale projection screens to view ISS activities and share information.

In the newly renovated POIC, a video wall of 24 displays has been installed in front of and above the flight control positions. Operational since mid-2013, the video wall provides capabilities that enhance collaboration among the ground team and enable them to more efficiently help the ISS crew and researchers around the world perform science on station.


New Navigation Software Cuts Self-Driving Car Costs

A new software system developed at the University of Michigan uses video game technology to help solve one of the most daunting hurdles facing self-driving and automated cars: the high cost of the laser scanners they use to determine their location.

Ryan Wolcott, a U-M doctoral candidate in computer science and engineering, estimates that the new concept could shave thousands of dollars from the cost of these vehicles. The technology enables them to navigate using a single video camera, delivering the same level of accuracy as laser scanners at a fraction of the cost.

"The laser scanners used by most self-driving cars in development today cost tens of thousands of dollars, and I thought there must be a cheaper sensor that could do the same job," he said. "Cameras only cost a few dollars each and they're already in a lot of cars. So they were an obvious choice."

Wolcott's system builds on the navigation systems used in other self-driving cars currently in development, including Google's vehicle. Those systems use three-dimensional laser scanning to create a real-time map of the environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they can determine the vehicle's location to within a few centimeters.

Wolcott's software instead converts the stored map data into a three-dimensional picture, much like a video game. The car's navigation system can then compare these synthetic pictures with the real-world pictures streaming in from a conventional video camera (a toy version of this matching loop is sketched below).

Source

Also: See more Software tech briefs.
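The core loop the article describes (render a synthetic view of the stored map at each candidate pose, score it against the live camera frame, keep the best-scoring pose) can be sketched as follows. This is a toy illustration, not U-M's code: `render_synthetic_view` is a hypothetical stand-in for the game-engine-style renderer, and plain normalized cross-correlation stands in for whatever image-similarity metric the real system uses.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Similarity between two grayscale images (higher means more alike)."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def localize(camera_frame, candidate_poses, render_synthetic_view):
    """Pick the map pose whose rendered view best matches the camera frame.

    render_synthetic_view(pose) -> grayscale image; a hypothetical stand-in
    for the video-game-style renderer the article describes.
    """
    scores = [normalized_cross_correlation(camera_frame,
                                           render_synthetic_view(pose))
              for pose in candidate_poses]
    return candidate_poses[int(np.argmax(scores))]
```

The article's "thousands of comparisons per second" corresponds to evaluating many candidate poses in a loop like this one, just heavily optimized.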


New Serenity Payload Detects Hostile Fire

Two government-developed sensors are working together to increase the security of deployed soldiers. The Firefly and Serenity sensors employ government-developed algorithms, software, and hardware to locate hostile fire around a base. The technology, a joint effort between the Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Army Research Laboratory (ARL), has been under development for more than a decade.
