Greg Olsen on Imaging, Space Science, and the Future of Engineering
- Created on Tuesday, 01 December 2009
Dr. Greg Olsen has had an illustrious career as a research scientist and entrepreneur. With degrees in physics and materials science, Olsen founded EPITAXX, a fiber-optic detector manufacturer, in 1984, and then founded Sensors Unlimited Inc. (SUI), a near-infrared camera manufacturer, in 1992. SUI was sold to Goodrich Corp. in 2005. During his career, Olsen developed vapor phase epitaxial crystal growth of optoelectronic devices, including laser diodes and photodetectors for fiber-optic applications based on the material indium gallium arsenide (InGaAs). He was awarded 12 patents, wrote more than 100 technical papers, co-authored several book chapters, and has given numerous lectures to both technical and trade journal audiences.
But Olsen may be best known for becoming, in 2005, the third private citizen to visit the International Space Station (ISS). After training for five months at the Yuri Gagarin Cosmonaut Training Center near Moscow, he launched on a Russian Soyuz rocket and docked with the ISS two days later. He completed more than 150 orbits of the Earth and logged almost 4 million miles of weightless travel during his ten days in space.
Imaging Technology spoke to Olsen about SUI cameras on NASA’s LCROSS mission, his experience in space, and how he’s helping to advance engineering education.
Imaging the Moon
Two of SUI Goodrich’s SWIR-InGaAs cameras are part of NASA’s Lunar Crater Observation and Sensing Satellite (LCROSS) payload, whose main mission was to confirm the presence or absence of water ice on the Moon. In October 2009, the LCROSS spacecraft separated into two sections, with the Centaur rocket impacting the lunar surface and kicking up a large plume of dust. The shepherding spacecraft, carrying the cameras, followed to image and analyze the resultant dust plume for water vapor, hydrocarbons, and hydrated materials.
The cameras recorded images of the debris, which were transmitted back to Earth in real time for evaluation. Because SWIR cameras can detect moisture contrast through dust, smoke, and fog, they were uniquely suited to recording the LCROSS impact for precise study of the debris cloud.
SWIR technology detects reflected light at wavelengths that the human eye cannot see, in wavelength bands between those of visible and thermal cameras. Olsen explained: “You can detect water in the near-infrared. If you want to think of it pictorially, water preferentially absorbs heat. So, if you’re looking at a picture that has water in it with a near-infrared camera, the water will look black or dark, because it’s absorbing the light, not reflecting it. An interesting application of our camera that NASA funded was for ice detection. Ice reflects heat more than liquid water does. So we could look at an aircraft wing and tell if it’s been properly de-iced by whether it’s dark or light.
“For our camera, in the near-infrared, the material it uses for detection is indium gallium arsenide. That material gives the best quantum efficiency in the 1- to 2-micron spectral region of any material. In that spectral region, it’s not a thermal imager the way you see pictures of hot objects at night. But it is good for night vision because at night, our eyes can’t see, yet there’s a lot of near-infrared light from stars and other sources. The Sensors Unlimited camera images that really well. That’s its advantage. I would say it images anything in the 1- to 2-micron region, and specifically water, which has a signature in that spectrum.”
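The dark-versus-light distinction Olsen describes can be sketched as a simple thresholding step: in a SWIR reflectance frame, water-bearing pixels read dark because they absorb 1- to 2-micron light, while dry or icy surfaces read brighter. The scene values and the cutoff below are illustrative assumptions for a toy frame, not parameters from SUI's cameras or NASA's LCROSS processing.

```python
import numpy as np

# Toy SWIR reflectance frame with values in [0, 1].
# Absorbing (wet) regions appear dark; reflective (dry) regions appear bright.
# Both the scene layout and the threshold are assumed for illustration.
frame = np.array([
    [0.80, 0.78, 0.75, 0.77],
    [0.79, 0.10, 0.12, 0.76],   # dark patch: SWIR-absorbing, water-like region
    [0.81, 0.11, 0.09, 0.74],
    [0.82, 0.80, 0.79, 0.78],
])

WATER_THRESHOLD = 0.3  # assumed cutoff separating dark (wet) from bright (dry) pixels

water_mask = frame < WATER_THRESHOLD    # True where a pixel reads as water-like
wet_fraction = water_mask.mean()        # share of the frame flagged as wet

print(water_mask.sum())        # -> 4 dark, water-like pixels
print(round(wet_fraction, 2))  # -> 0.25
```

A real pipeline would compare reflectance in an absorption band against a nearby reference band rather than use a fixed cutoff, but the principle is the same one Olsen states: water betrays itself by how dark it looks in this spectral region.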