Dr. Murzy Jhabvala

Visible light is only one narrow band of the electromagnetic spectrum, and it doesn't always tell scientists what they need to know. Infrared, which lies outside the range of human eyesight, has for years been used to tease out the mysteries of distant stars or to allow users to see in the dark. NASA scientists have now improved Quantum Well Infrared Photodetector (QWIP) array technology to capture more detail than ever before. NASA engineer Dr. Murzy Jhabvala led the project.

NASA Tech Briefs: What is the Quantum Well Infrared Photodetector Array, and how does it work?

Murzy Jhabvala: It's a gallium arsenide-based structure. Gallium arsenide, in terms of the technology used to fabricate devices, is very similar to silicon technology, which is used to make all the computer chips. Because we are using a comparable technology, it is very versatile and very easy to work with; we don't have to invent new equipment to deal with it.

But it is a gallium arsenide-based technology. With the Quantum Well Infrared Photodetector, or QWIP, for the quantum well part of that, we start with a gallium arsenide wafer, which looks very similar to a blank silicon wafer, and on it we grow alternating layers of different materials. In our case, the alternating layers are aluminum gallium arsenide, gallium arsenide, and indium gallium arsenide. Each layer is on the order of forty or fifty atoms thick, and we grow about 106 periods of them. So in our case, we had a bottom layer of aluminum gallium arsenide, then a layer of gallium arsenide, then a layer of indium gallium arsenide, then a top layer of gallium arsenide. Those four layers were repeated over 100 times, and each of them is about 40 to 50 atoms thick.

So we create this structure on the surface of the gallium arsenide, and this is the quantum well part of the device. Once the wafer is constructed, we pattern it into discrete detector elements. In our case, we patterned it into a 1,000-by-1,000 square array, with each element being 25 microns in size. And when you have 1,000 by 1,000, the total comes to a million, so we make this one-million-pixel array.
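As a quick sanity check, the array geometry he describes works out as follows; this is a back-of-the-envelope sketch using only the figures given above (1,000-by-1,000 format, 25-micron pixels):

```python
# Array geometry for the QWIP described above: 1,000 x 1,000 pixels,
# each 25 microns on a side.
pixels_per_side = 1000
pixel_pitch_um = 25

total_pixels = pixels_per_side ** 2                          # one million pixels
side_length_mm = pixels_per_side * pixel_pitch_um / 1000.0   # active area per side

print(total_pixels)    # 1000000
print(side_length_mm)  # 25.0 (so the array is 25 mm x 25 mm)
```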

And what is unique about this array: QWIPs, by virtue of their structure, tend to be responsive to a specific wavelength, a specific narrow band, because of their quantum nature. What we did here that is somewhat unique is that we were able to make the QWIP responsive to a broad band of wavelengths; in our case, 8 to 12 microns. That's important in terms of spectroscopy, where you want to work with an object and see its characteristics at a variety of wavelengths, not just one. That's where the information is: in the spectral content of the image you are looking at.

And QWIPs, like many infrared detectors, have to be cooled. In our case, these were cooled to about 60 kelvin, which is quite cold: about -213 °C. But infrared detectors, particularly sensitive infrared detectors, have to be cooled. There are a number of reasons; some just won't work until they get to a certain temperature. And they tend to make their own signals when they are warm, or warmer, and these signals will overwhelm any incoming signal; you can't discriminate between the signal you're looking at and the signal the detector is making. It's "noisy." So you have to keep cooling it until you make the detector signal go away. We ran ours around 60 kelvin, and what is new for us is that we were able to capture imagery in video format. We have videos!

Further, there are quite a few detectors in the infrared spectrum that are not silicon-based: mercury cadmium telluride, indium antimonide, and a whole range. All those detectors convert the incoming infrared signal into electrons; that is what the detector does. But you have to take those electrons and get them out of the detector, and to do that, the detector has to be physically mated to a read-out chip. The read-out chip has the same exact footprint as the QWIP detector array. So if you have a million pixels, each 25 microns square, you need a read-out chip with a million cells, each also 25 microns square and containing about three or four transistors. You physically mate the detector to this read-out chip, and once the incoming photons hit the detector array and are converted to electrons, those electrons are extracted.
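The cooling requirement described above can be illustrated with a toy model: a detector's self-generated signal (dark current) typically falls off exponentially with temperature, following an Arrhenius-type law. The activation energy used here is an illustrative assumption, not a QWIP-specific figure:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def dark_current_ratio(t1_k, t2_k, activation_ev=0.1):
    """How much detector self-signal falls when cooling from t1_k to t2_k,
    using a simple Arrhenius model I ~ exp(-Ea / kT).
    activation_ev is an illustrative value, not a measured QWIP parameter."""
    return (math.exp(-activation_ev / (K_B_EV * t2_k)) /
            math.exp(-activation_ev / (K_B_EV * t1_k)))

# Cooling from 77 K (liquid nitrogen) to the 60 K mentioned above
# cuts this toy model's dark current by well over an order of magnitude:
print(dark_current_ratio(77.0, 60.0))
```

The exact numbers depend entirely on the assumed activation energy; the point is only the exponential payoff of each additional degree of cooling.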

So what the array does is convert infrared radiation, photons, into electrons. Silicon chips understand electrons; they have no understanding of photons. We take those electrons and convert them into a voltage, and the voltage we get is proportional to the incoming radiation, the quantity of photons that came in.
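The photons-to-electrons-to-voltage chain he outlines can be sketched as a back-of-the-envelope model. The quantum efficiency and integration capacitance below are illustrative assumptions, not figures from the interview:

```python
ELECTRON_CHARGE = 1.602e-19  # coulombs

def pixel_voltage(incident_photons, quantum_efficiency=0.1, capacitance_f=1e-13):
    """Voltage on a read-out cell's integration capacitor, V = N * q / C.

    quantum_efficiency and capacitance_f are illustrative assumptions.
    """
    electrons = incident_photons * quantum_efficiency  # detector: photons -> electrons
    return electrons * ELECTRON_CHARGE / capacitance_f  # read-out: electrons -> voltage

# The output voltage scales linearly with the incoming photon count,
# which is why the signal is proportional to the radiation:
v1 = pixel_voltage(1e6)
v2 = pixel_voltage(2e6)
print(v2 / v1)  # 2.0
```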

NTB: How does this model differ from other infrared sensors or past QWIP arrays?

Jhabvala: Compared to past QWIP arrays, I think this is the largest format. It has the largest number of pixels, in our case a million, that have been made in this broad band, in this 8-to-12-micron region. We don't think anybody has done that. That is the first difference.

The second is, we are not sure there are competing technologies. I can't say for sure one way or the other, but my search shows there are not. There are, for example, mercury cadmium telluride, indium antimonide, and a whole range of silicon-based detectors that respond to the infrared, but we're not aware of any of them that have been made in a 1K-by-1K format with 8-to-12-micron responsivity. So that is how it differs from other technologies.

It differs from what we have done in the past in that this is the first time we've had a broad-band detector. We made a 1K-by-1K narrow-band detector on the way to doing this. It was responsive from about 8 to 8.5 microns, a very narrow range, so you can only get one spectral point, basically, if you are looking at an object. With this one, 8 to 12 microns, if you put on the appropriate filters, you can get many different spectral points of information on the same object with the same detector.

If you are looking at a bright star at 8 to 8.4 microns, you'll see a nice picture of what that star looks like, a photograph. But that is all you'll be able to see. You won't be able to tell all the different compounds in that star, all the different molecules. You don't have enough spectral information; you aren't looking across enough of the spectrum. So it'd make a very pretty picture at that one wavelength, but if you wanted to know all the different compounds, all the different atoms, you'd have to look at it from 8 to 8.1 microns, then 8.1 to 8.2, then 8.2 to 8.3, and so on. What do you see from 8.3 to 8.8? What do you see from 8 to 9? All the way up, you'd be able to get a whole lot of different information, because you are looking at a broad spectral region. That's the spectroscopy of it, and that is where the information is. It'll still make a nice photograph, but what you really want is to understand what the constituents of that star are.

If you want to look at an object on the Earth and try to ascertain its temperature, you need at least two different wavelengths to do that, because you have two variables when you are trying to determine the temperature of an object. So in this sense, this QWIP is far more versatile than the single-wavelength QWIPs.
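The two-wavelength temperature measurement he mentions is the basis of ratio (two-color) pyrometry: the ratio of radiances at two wavelengths pins down the temperature even when the object's overall brightness is unknown. A minimal sketch, using the Wien approximation to Planck's law and two bands inside the 8-to-12-micron range (the wavelengths chosen here are illustrative):

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(wavelength_m, temp_k):
    """Spectral radiance under the Wien approximation (arbitrary scale)."""
    return wavelength_m ** -5 * math.exp(-C2 / (wavelength_m * temp_k))

def two_color_temperature(l1, l2, r):
    """Recover temperature from the radiance ratio r = L(l1) / L(l2).
    Exact inverse of the Wien-approximation forward model above."""
    return C2 * (1.0 / l2 - 1.0 / l1) / math.log(r * (l1 / l2) ** 5)

# A 300 K (room temperature) scene measured at 9 and 11 microns:
l1, l2, t_true = 9e-6, 11e-6, 300.0
ratio = wien_radiance(l1, t_true) / wien_radiance(l2, t_true)
print(round(two_color_temperature(l1, l2, ratio), 1))  # 300.0
```

A real instrument would also have to account for emissivity and the atmosphere, but the two-band ratio is the core of the method.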

NTB: What applications does infrared have?

Jhabvala: Any object in the entire Universe will emit radiation as a function of the temperature of that body. The hotter the body, the more radiation it will emit. Your eyeball is sensitive to only a very narrow portion of the spectrum, the visible spectrum. Objects have to get pretty darn hot before you can see them with your eyeball. You stick a poker in the fire; until that poker reaches about 900 °C, it looks black. And then it starts to turn red. That red that you see is visible radiation coming off the poker because it is that hot.

But most objects aren't that hot. Now, that poker, when it is only 400 °C, will be very hot to your finger, but to your eyeball it looks just as if it were ice-cold. You can't tell the difference. Infrared detectors can; they can tell the difference. It is because of that characteristic: objects, as a function of their temperature, will emit more and more radiation, first infrared radiation, then visible radiation, and then, if they get very hot, like the Sun or even hotter, X-ray radiation. Infrared sensors are particularly sensitive to objects in what we call the thermal region, which is room temperature and a little hotter; they can tell the temperature of things there that your eyeball cannot.
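The "thermal region" he describes can be made concrete with Wien's displacement law, which gives the wavelength where a blackbody's emission peaks: room-temperature objects peak near 10 microns, right in the QWIP's 8-to-12-micron band, while only very hot objects peak in the visible. A short sketch:

```python
# Wien's displacement law: lambda_peak = b / T,
# with b ~ 2898 micron-kelvin.
WIEN_B_UM_K = 2898.0

def peak_wavelength_um(temp_k):
    """Wavelength (in microns) of peak blackbody emission at temp_k."""
    return WIEN_B_UM_K / temp_k

print(round(peak_wavelength_um(300.0), 2))   # room temperature -> 9.66 um
print(round(peak_wavelength_um(1173.0), 2))  # a ~900 C poker -> ~2.5 um
print(round(peak_wavelength_um(5778.0), 2))  # the Sun -> ~0.5 um (visible)
```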

NTB: How has NASA used QWIP technology in the past, and how would this new version be used?

Jhabvala: QWIP technology is still an emerging technology, in NASA terms. NASA wants to use technologies that are proven, that have gone through the technology-readiness levels, where all the risks have been retired. In that sense, NASA is very conservative, and rightly so. You don't want to be taking risks on a one-shot, billion-dollar satellite. You want everything to be proven and a done deal. QWIPs, as far as I know, have not flown in any NASA flight mission.

We have used them in a number of Earth-observing aircraft, and this is all part of the path to getting them into flight missions. We flew them as part of a consortium called SAFARI 2000, which was a large collaboration between NASA and other institutions to do environmental monitoring in southern Africa. We just recently finished flying a QWIP camera, the 8.5-micron narrow-band one, in a collaboration called BASE-ASIA, another consortium to do environmental monitoring in Southeast Asia, mostly Thailand.

We have flown them on a number of aircraft to look around this area, Greenbelt, and we've flown them in Monterey, CA, to do terrestrial observations. So we're trying to do our part to get them matriculated into the whole NASA technology-readiness program. But at this point, NASA hasn't selected them for any missions. Very recently, some very good, very competitive proposals for NASA missions have been submitted, not by us but by other organizations, but I don't think any have been selected.

NTB: What earthbound applications could QWIPS have?

Jhabvala: I keep a running list! I just got an inquiry from the U.S. Geological Survey. There's the semi-obvious "earth science stuff": studying troposphere and stratosphere temperatures and identifying trace chemicals (hopefully there aren't many, but they are out there); studying the tree-canopy energy balance; measuring cloud layers, their emissivities, droplet and particle sizes, composition, and height; and a whole range of pollutant measurements, like from volcanic eruptions. We could track dust particles; we haven't yet, but that's an application QWIPs could be used for. Oddly enough, dust storms from the Sahara Desert deposit dust in New York. We could monitor that, as well as CO2 absorption and coastal erosion, and we can look at ocean and river thermal gradients; like I said, these detectors are very sensitive to temperatures around room temperature. We can also do ground-based astronomy and temperature sounding, which is measuring temperature as a function of distance or height.

And then there are applications outside earth science. One is medical diagnosis. There is a company in New York that is actually using a QWIP as part of a system they built to determine whether tumors are benign or cancerous. So there are medical applications, and I would like to pursue those. What we did in Africa was very easy: locating forest fires, and then, after a fire goes out, locating hotspots your eyes can't identify.

I also keep a list, from different journals, of things people would like to measure but don't know how, like the locations of unwanted vegetation. Alien vegetation is a big problem in the world now. People bring in prickly-pear cactus, and the next thing you know, that's all that's growing. Or kudzu. There is a whole range of things growing where they shouldn't be that choke out the local vegetation.

Then there is monitoring crop health. By scanning crops in the infrared and then looking at their spectral fingerprints, we can get a handle on whether they have too much water or too little, and whether they're diseased or healthy. One interesting problem we discovered, which we were invited to Brazil to present on: they have power lines that run across the Amazon, many hundreds to thousands of miles of power lines in dense forest, and their transformers routinely go bad. Right now, they have to send someone to find a dead transformer and replace it. If a transformer is heating up, it would be very easy to spot with a QWIP camera.

We could monitor effluents from industrial operations like paper mills, mining operations, and power plants that suck in river water for cooling and then spit it out. Sometimes they spit it out too hot, and it will harm the river and the life in it.

A scientist here at Goddard, Friedemann Freund, thinks that prior to an earthquake, the enormous pressures generated in the Earth's crust will emit infrared photons that we can detect with a QWIP camera. So if, all of a sudden, we see a spike in this signal, it could indicate an earthquake is about to occur. This is still a theory; they are still trying to prove it, but they believe the physics supports it.

Then there is monitoring food spoilage, ripeness, and, in particular, contamination: various diseases that are showing up in the food chain, Mad Cow and what have you. It might be possible to use the QWIP to identify contaminants or anything abnormal in selected food samples.

The U.S. Geological Survey would like to use the QWIP to locate caves, particularly on Mars. But before they do that, they would like to see if they can locate caves on Earth. It wasn't clear to me how they would do this, but as it turns out, caves during the day tend to sit at an equilibrium temperature. Once it gets cold at night, a cave stays at its temperature; if a cave is at, say, 70 °F and the night air drops to 50 °F, the mouth of the cave will be a big, bright hotspot. It hadn't occurred to me, but they said they should be able to fly a helicopter over places like Arizona or New Mexico and find where caves are. And if they can, they would like to propose this as a way to find caves on Mars. They were particularly interested because of the recent discovery of a pretty deep cave ecosystem in Israel. The Survey thinks this could be occurring on Mars and we wouldn't even know it. Again, this is just a potential application.

What I am really pushing for are medical applications. I think there is a lot of potential here. The problem between the technology and medicine fields is that we speak two different languages. They have lots of problems, and we have lots of solutions, and the two never seem to connect up right. The doctors don't know what infrared technology is, and we really don't know what they're looking for. If you've been following X-ray mammography: for years, when they want to do a mammogram, they have used X-ray film, but CCD cameras, the things you use in your camcorders, can do an equally good job. The problem is that the professionals are so trained on these films that when they look at these other types of images, they have to learn all over again. So there is some reticence on the part of the medical profession to embrace these new, advanced technologies, and it is hard for the engineers to put it in terms such that doctors don't have to get an engineering degree to use it. But I think the problems are there, and the solutions are there, and we just have to figure out a way for the two to meet.

For more information, contact Dr. Murzy Jhabvala.

NASA Tech Briefs Magazine

This article first appeared in the August 2006 issue of NASA Tech Briefs Magazine.
