The term Space Imaging covers a wide variety of mission types and technologies.

Three Projects Illustrate the Range

The first is an upgrade of the Mastcam cameras used on NASA's Mars Curiosity rover. The two new Mastcam-Z cameras will travel to Mars on the next rover mission, scheduled for launch in 2020. They will be able to take detailed three-dimensional images of the rocks and minerals on Mars' surface and stitch them together to form panoramic views.

The goal of PLATO — exoplanets that could support life.

The second project will improve upon the Kepler space observatory mission to find earth-sized exoplanets orbiting stars in the Milky Way. The Planetary Transits and Oscillations of stars (PLATO) mission, to be launched in 2026, will carry 26 imaging telescopes, compared to the single one on Kepler. Teledyne e2v is manufacturing advanced imaging sensors for the mission.

Number three is the Segmented Planar Imaging Detector for Electro-optical Reconnaissance (SPIDER). This is a joint program of DARPA and Lockheed Martin to develop an ultra-thin telescope for use on satellites to do high-precision imaging of earth.

Mastcam-Z

The drawback to the cameras on the Curiosity rover is that each of the two has a different fixed-focal-length lens. The “left eye” is a 34 mm wide-angle lens and the “right eye” is a narrow-angle 100 mm lens. “With a camera system like that, it's very challenging to do stereo. You have to degrade the resolution of the high-resolution camera to match that of the low. With a zoom camera in each eye, however, we can match the focal length between the left and right eyes and get routine stereo at scales from wide angle all the way to high resolution — that's a big change,” said Jim Bell, Principal Investigator of the Mars 2020 Mastcam-Z investigation. Routine stereo at high resolution will come in handy for decisions about drive directions, about how and where to put the arm down onto targets, and about where to drill and core samples. “We also need it for science reasons — for understanding what shape and texture tell us about the evolution of the rocks and soils,” said Bell. The 3D imaging enabled by stereo will give a detailed understanding of the shape and texture of surface features across the planet.

The second modification, based on what was learned from the Curiosity Mastcam, was “tweaking” the wavelengths of its built-in filters. This will increase sensitivity to different kinds of iron-bearing minerals, so as to discriminate among iron oxides, oxyhydroxides, and some iron-bearing sulfates, as well as some unoxidized iron-bearing silicates that are very typical on terrestrial planets. Mars is a terrestrial planet like the earth and has similar kinds of mineralogy in its crust. The minerals on earth or any other planetary body provide a way to understand the environmental conditions on that surface over time. “So especially in the case of clays or certain iron oxides that can only form in water for example, they tell us whether there was water in that environment,” said Bell.

Aside from the zoom lenses, Mastcam-Z is hard to distinguish from its predecessor. According to Bell, the cameras were required to fit into the same envelope. The 2020 Rover is using the same mast and 90% of the spare parts of Curiosity. “Since NASA is very risk-averse we proposed making very few changes rather than a radical new design,” said Bell.

The prototype will be put through temperature cycling, shock, and vibration tests exceeding the limits experienced on past Mars missions. The equipment will experience very extreme shock and vibration on launch and again on landing — thousands of G's for a very short duration. Vibration will also happen while driving the Rover on Mars. To avoid getting noise in the images, the cameras also have to be insensitive to electrical interference from other rover systems.

PLATO

The telescope on the Kepler space mission has been sending back images for more than five years and has detected over 1,000 planets circling other stars in its hunt for earth-like planets. “The interest is to find an earth-sized planet at the same distance from their star as we are from the sun. That's very challenging technically, so we have to build very powerful telescopes and instruments to achieve that,” said Dr. Paul Jorden, Astronomy Technical Specialist at Teledyne e2v.

PLATO sensor shown without protective metalwork. (Photo courtesy of Teledyne e2v)

The 26 telescopes on the PLATO mission will be able to cover a much wider area of the sky than Kepler and collect a much larger — 2-Gigapixel — dataset, with far greater sensitivity. Planets are discovered by sensing a dimming of the light from the star as the planet orbits. The light is dimmed by only one part in 10⁵, which makes sensitivity a crucial parameter.
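To give a feel for how small that dimming is, the fractional drop in brightness during a transit scales with the square of the planet-to-star radius ratio. A minimal sketch (the radii below are standard reference values, not figures from the article):

```python
# Sketch: fractional dimming (transit depth) when a planet crosses its star.
# Depth ~ (R_planet / R_star)^2. Radii are standard reference values.

R_SUN_KM = 695_700.0      # solar radius
R_EARTH_KM = 6_371.0      # Earth radius
R_JUPITER_KM = 69_911.0   # Jupiter radius

def transit_depth(r_planet_km: float, r_star_km: float) -> float:
    """Fraction of starlight blocked while the planet is in transit."""
    return (r_planet_km / r_star_km) ** 2

earth_depth = transit_depth(R_EARTH_KM, R_SUN_KM)
jupiter_depth = transit_depth(R_JUPITER_KM, R_SUN_KM)
print(f"Earth-like transit depth:   {earth_depth:.1e}")    # ~8.4e-05
print(f"Jupiter-like transit depth: {jupiter_depth:.1e}")  # ~1.0e-02
```

An earth-sized planet crossing a sun-like star blocks roughly one part in ten thousand of its light, which is why photometric sensitivity drives the sensor design.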

Sensors for space imaging differ from the general industry trends for smaller size and lower cost. For space, you want the sensor to have a large area to capture a large viewing angle with maximum sensitivity — large focal planes, large areas of silicon. The other difference is in performance — you want high sensitivity, and small chips don't always give you that. You also want parameters such as a high dynamic range and linearity — being able to measure a small signal at the same time as a large one. Some of these factors aren't so crucial for commercial components because you can make corrections in the software or hide issues in the way in which you display the data.
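The dynamic-range requirement mentioned above is commonly expressed as the ratio of a pixel's full-well capacity to its read-noise floor. A minimal sketch, using illustrative numbers rather than Teledyne e2v specifications:

```python
import math

# Sketch: sensor dynamic range = full-well capacity / read noise.
# Numbers below are illustrative, not specifications of any PLATO sensor.

full_well_e = 100_000.0   # electrons a pixel can hold before saturating (assumed)
read_noise_e = 5.0        # noise floor in electrons rms (assumed)

dynamic_range = full_well_e / read_noise_e
dynamic_range_db = 20 * math.log10(dynamic_range)
print(f"Dynamic range: {dynamic_range:.0f}:1 ({dynamic_range_db:.1f} dB)")
```

A large full well paired with low read noise is what lets the sensor measure a faint signal in the same frame as a bright one, which is the linearity requirement Jorden describes.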

The other significant difference is the extreme amount of environmental testing for space-borne cameras. In addition to temperature, shock, and vibration, radiation is a serious problem. Various performance parameters are influenced by radiation. One is dark current, where you get excess charge in the device. For that, you can modify the design so it's less susceptible to damage. Although you can't remove it completely, you can reduce the effects. When a device is irradiated in space it generally introduces a lot of places where the image — the charge on the sensor — can be trapped, which deteriorates performance. You can overcome that to some degree by artificially introducing signals.

There is a synergy between space-based cameras and those in ground-based observatories, both in form and function. According to Jorden, the basic technology of the sensors is the same. In fact, Teledyne e2v often develops designs for earth telescopes, and then modifies them for travel into space. As for function, the space telescopes can pinpoint the location of an exoplanet and that information is sent as guidance to observatories on earth. “When you detect planets from the spacecraft, you can detect thousands of them but usually you have to make more detailed measurements from bigger telescopes on the ground to confirm and learn more details about those planets. So we do both — we make the ‘detection engines’ that are in space and then we make the follow-up sensors that go on larger spectroscopic instruments on the ground,” said Jorden.

As to trends in space-based cameras, Jorden points to making the sensors larger. A decade ago it would typically be about 16 megapixels, “now we are able to make up to a 91 megapixel sensor,” he said. Another significant change is the move from CCD to CMOS. It's only in the last few years that CMOS has approached the quality you can get with CCD. CMOS is intrinsically more hardened against radiation and dissipates a lot less power.

SPIDER

Lockheed Martin has taken on the challenge of producing an imaging sensor with a wide surface area, while reducing size, weight, and power consumption. Key to their approach is interferometry, which has been used in ground-based observatories, collecting data from the sky to stitch together high-resolution images. However, SPIDER flips that concept, viewing instead from space and imaging the earth's surface. It trades long telescopes and complex combining optics for hundreds or thousands of tiny lenses using PICs (photonic integrated circuits) to combine the light to form interference fringes and then uses algorithms to process that fringe data into recognizable images.

Engineers working on the wide surface area sensor for SPIDER. (Photo courtesy of Lockheed Martin)

Interferometric imaging helps create sharp images without the depth and mass of a conventional telescope. PIC technology and the ability to combine high numbers of interference signals are critical for the design. The SPIDER concept uses thousands of detectors densely packed onto PICs to measure amplitude and phase at frequencies that span the full synthetic aperture. That provides an increase in pixel count while maintaining a thin disk.
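The reason a thin disk can match a deep telescope tube is that the diffraction-limited angular resolution depends on the span of the aperture, not its depth: for an interferometric array it is set by the longest baseline between lenslets, roughly wavelength divided by baseline. A minimal sketch with illustrative values (not SPIDER specifications):

```python
import math

# Sketch: angular resolution of a synthetic aperture, ~ lambda / B,
# where B is the longest baseline between lenslets in the array.
# Wavelength and baseline below are assumed illustrative values.

def angular_resolution_rad(wavelength_m: float, baseline_m: float) -> float:
    """Diffraction-limited angular resolution in radians, ~ lambda / B."""
    return wavelength_m / baseline_m

wavelength = 550e-9   # visible light
baseline = 0.5        # 0.5 m across the lenslet array (assumed)

res = angular_resolution_rad(wavelength, baseline)
print(f"Resolution: {res:.1e} rad ({math.degrees(res) * 3600:.2f} arcsec)")
```

Doubling the width of the lenslet array halves the resolvable angle, with no increase in the instrument's thickness — the trade SPIDER exploits.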

SPIDER's PICs do not require complex, precision alignment of large lenses and mirrors. That means less risk in orbit. And its many lenses can be rearranged into different configurations, which could offer flexible placement options on its host. While telescopes have commonly been cylindrical, SPIDER could be used for different thin disk shapes, from squares to hexagons and even conformal concepts.

The Family of Space Imagers

Although there are unique issues in space imaging, depending on the particular application, there are also common challenges. For any space-based imager, small size and mass, peak energy efficiency, and environmental hardening are key. Maximizing sensor area and sensitivity are goals regardless of whether the imaging is from space or earth.

Artist's rendition of SPIDER in space. (Image courtesy of Lockheed Martin)

Also, the three projects described here are long-term; each is expected to be finalized only sometime in the next decade.

Perhaps the most important quality of space imaging missions can be summed up in the words of Mastcam's Jim Bell:

“I think the most exciting thing about this mission is this will be the first time, and I've been involved in a number of Mars missions, we'll be operating the cameras to take pictures of soils and rocks and sand dunes and distant hills and sky in places where we'll be able to put the arm down and drill into and collect samples. For the first time, I might actually get to see those samples with my own eyes, because the intention is that a future mission in the 2020s will bring them back to the earth. Being able to take a picture now and say to myself – if I stay healthy, exercise well – I might get to see these actual samples in the lab someday with my own eyes. That'll be pretty cool!”

This article was written by Ed Brown, Associate Editor of Photonics & Imaging Technology.