Tiny Devices Project Sharp, Colorful Images
- Created on Sunday, 01 November 2009
Originating Technology/NASA Contribution
Johnson Space Center, NASA’s center for the design of systems for human space flight, began developing high-resolution visual displays in the 1990s for telepresence, which uses virtual reality technology to immerse an operator in the environment of a robot in another location. Several industries use telepresence when virtual immersion in an environment is the safer option; applications include remote training exercises, virtual prototyping, and remote monitoring of hazardous environments. Microdisplay panels, the tiny screens that comprise the visual displays for telepresence, are also used in some electronic viewfinders for digital video and still cameras.
In 1993, Johnson Space Center granted a Small Business Innovation Research (SBIR) contract to Displaytech Inc., based in Longmont, Colorado, and recently acquired by Micron Technology Inc., of Boise, Idaho. Under Phase I of this contract, Displaytech began developing miniature high-resolution displays based on its ferroelectric liquid-crystal-on-silicon (FLCOS) technology. Displaytech proposed that pixels could be made small enough to fit a complete high-resolution panel onto a single integrated circuit.
Displaytech first determined how to make a panel that could reproduce grayscale using only standard complementary metal-oxide-semiconductor (CMOS) logic circuitry, which recognizes only binary values (such as a “0” for black and a “1” for white) and is not well suited to subtle shades of gray. Dr. Mark Handschy, Displaytech’s chief technology officer, explains that the company perfected time-based grayscale techniques under a Phase II follow-on NASA contract: “Because our ferroelectric liquid crystal material can switch faster than the eye can follow, a sequence of displayed black and white images is averaged by the eye into a single grayscale image.”
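The idea can be sketched in a few lines of code. The function below is a simplified illustration, not Displaytech’s actual drive scheme: for a desired gray level between 0 and 1, it produces a sequence of binary (black/white) subframe states whose time-average, as perceived by the eye, approximates that gray level.

```python
def binary_subframes(gray_level, n_subframes):
    """Return a list of 0/1 pixel states whose average approximates gray_level.

    Models time-based grayscale: a fast-switching binary pixel shows only
    black (0) or white (1), and the eye averages the sequence into gray.
    """
    states = []
    accumulator = 0.0
    for _ in range(n_subframes):
        accumulator += gray_level
        if accumulator >= 1.0:          # duty-cycle the "white" state
            states.append(1)
            accumulator -= 1.0
        else:
            states.append(0)
    return states

frames = binary_subframes(0.25, 8)      # 25% gray over 8 subframes
perceived = sum(frames) / len(frames)
print(frames, perceived)                # two of eight subframes white -> 0.25
```

Because the ferroelectric liquid crystal switches far faster than the eye’s integration time, many such subframes fit inside one perceived frame, so the average is what the viewer sees.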
For FLCOS panels to work well, Handschy explains, they need a smooth and shiny wafer top surface. Without this, the pixel mirrors, which form in the last metal layer on the semiconductor wafer, scatter and absorb light, resulting in a dim appearance. “The Phase II of our NASA SBIR came at a very opportune time,” Handschy says. “We were able to have an SXGA [super-extended graphics array] CMOS backplane we’d designed under the NASA project fabricated using one of the first commercially available CMP silicon processes.” Chemical mechanical planarization (CMP) is a special technique of polishing semiconductor wafers to allow more metal layers—and smoother integrated-circuit surfaces—and was one of the factors that led to Displaytech’s success.
Another important development during the mid-1990s was the introduction of efficient blue light-emitting diodes (LEDs). Displaytech took these bright blue LEDs and combined them with red and green LEDs to illuminate its panels, rapidly sequencing through the color LEDs to create the illusion of different hues as they reflect off the panels. “In this SBIR program, we developed grayscale and color for microdisplay panels,” Handschy says, “and that was a first for us. We’ve since leveraged that into a line of products.”
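This field-sequential color scheme can also be sketched in code. The example below is a hypothetical illustration (the function names are ours, not from any Displaytech or Micron driver): a single RGB pixel value is expanded into a rapid sequence of red, green, and blue LED fields, and the eye’s temporal average of those fields recovers the intended color.

```python
def field_sequence(rgb, fields_per_frame=3):
    """Expand one RGB pixel into a sequence of single-color LED fields.

    Each field lights one LED (red, green, or blue) while the panel
    modulates that channel's intensity; fields repeat within a frame.
    """
    r, g, b = rgb
    return [("red", r), ("green", g), ("blue", b)] * (fields_per_frame // 3)

def perceived_color(fields):
    """The eye's temporal average of the field sequence: mean per channel."""
    totals = {"red": 0.0, "green": 0.0, "blue": 0.0}
    counts = {"red": 0, "green": 0, "blue": 0}
    for led, intensity in fields:
        totals[led] += intensity
        counts[led] += 1
    return tuple(totals[c] / counts[c] for c in ("red", "green", "blue"))

fields = field_sequence((0.8, 0.4, 0.1), fields_per_frame=6)
print(perceived_color(fields))
```

Because the colored fields are sequenced faster than the eye can follow, the viewer perceives a single full-color image rather than flickering red, green, and blue frames.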