The colonel has a problem. He has eight unmanned aerial vehicles (UAVs) flying over the outskirts of Baghdad, looking for potential insurgent activity. Are those people he sees moving through the streets insurgents or are they a US infantry patrol? If he alters the flight path of one UAV to loiter over the suspicious activity, how should he deploy the other seven UAVs to pick up the first's original mission? How does he keep track of the positions of the eight UAVs, what they are seeing, and the locations of nearby US troops, all in real-time?

One potential solution is being developed at Iowa State University's Virtual Reality Applications Center (VRAC). Earlier this year, the center opened its newly redesigned C6 virtual reality environment, a six-sided room (four walls, floor, and ceiling) that generates a world-leading 100 million pixels to create a hyper-realistic, immersive 3D experience. The C6 environment was engineered by Mechdyne Corporation (www.mechdyne.com), a leading developer of advanced visual display systems and software, and uses NVIDIA Quadro® technology (www.nvidia.com/quadro) for its graphics horsepower.

Conceptual view of Iowa State University’s C6 Virtual Reality Applications Center.

"The C6 is a completely immersive 3D display, with sound, for a total sensory experience," says Kurt Hoffmeister, vice president for R&D at Mechdyne. "It really puts you in the middle of whatever is being modeled."

The C6 offers more than 16 times the resolution of a typical immersive room and more than double that of its five-sided, 43-million-pixel predecessor, which was also built by Mechdyne.

Not only is the C6 environment applicable to Air Force battlespace management, it can also be used for applications as diverse as urban planning and genomic research. Even Google Earth flythroughs can be improved and enhanced with the technology being developed and tested at VRAC. The C6 environment supports traditional virtual reality applications such as architectural walkthroughs, as well as data analysis, displaying multivariate information in 3D and enabling researchers to perceive patterns that would not be recognized through traditional numerical analysis.

The technology behind the C6 is truly leading edge. The images on the six walls are created by 24 Sony SXRD projectors, each with a native resolution of 4096 × 2160. A pair of projectors is stacked to fill each of the six walls with a 4096 × 4096 image, and each pair is then duplicated to create left- and right-eye stereo versions of the image for every wall. The system generates a total of 201,326,592 pixels, or 100,663,296 pixels per eye.
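For readers who want to verify the arithmetic, here is a minimal sketch that simply reproduces the pixel counts quoted above; all figures come directly from the article.

```python
# Reproducing the C6 pixel counts quoted in the article.
WALLS = 6
WALL_WIDTH, WALL_HEIGHT = 4096, 4096    # one stacked projector pair per wall

pixels_per_eye = WALLS * WALL_WIDTH * WALL_HEIGHT   # 100,663,296
total_pixels = 2 * pixels_per_eye                   # 201,326,592 (left + right eye)

print(f"per eye: {pixels_per_eye:,}  total: {total_pixels:,}")
```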

The Iowa State University C6 image generator incorporates 96 NVIDIA Quadro professional GPUs in an HP computer cluster.

The stereo images cycle between the right and left eyes 80 times per second (160 Hz active stereo). 3D shutter glasses operate in sync with the projectors to fuse the two views into a single 3D image in the viewer's brain.
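To make the frame-sequential ("active") stereo scheme concrete, here is an illustrative sketch. The rates come from the article; the alternation function is only a toy model, not the actual synchronization logic used in the C6.

```python
# Toy model of 160 Hz active stereo: the display alternates left/right images,
# so each eye sees 80 updates per second.
DISPLAY_RATE_HZ = 160
per_eye_rate_hz = DISPLAY_RATE_HZ // 2      # 80 Hz per eye
frame_period_ms = 1000 / DISPLAY_RATE_HZ    # 6.25 ms per displayed frame

def open_eye(frame_index: int) -> str:
    """Which shutter is open for a given display frame (toy model)."""
    return "left" if frame_index % 2 == 0 else "right"

print(per_eye_rate_hz, frame_period_ms, [open_eye(i) for i in range(4)])
```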

All this is powered by what may well be the world's largest graphics computing cluster: 48 rack-mounted HP xw9300 workstations, plus one control node that runs the application and interactivity software. Each workstation contains two NVIDIA Quadro FX professional graphics cards along with an NVIDIA Quadro Gsync card to synchronize the displays and stereo phase of all outputs. In all, the cluster provides 96 channels of graphics output, 48 per eye.
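As a quick sanity check on those numbers, the sketch below uses only figures from the article, plus one clearly labeled assumption about how the 96 channels map onto the 24 projectors.

```python
# Channel budget for the C6 graphics cluster, using the article's figures.
nodes = 48                      # HP xw9300 workstations (plus one control node)
gpus_per_node = 2               # NVIDIA Quadro FX cards per workstation

total_channels = nodes * gpus_per_node      # 96 graphics outputs
channels_per_eye = total_channels // 2      # 48, as stated in the article

# Assumption (not stated in the article): each 4096 x 2160 projector is fed
# as four quadrants, since a single DVI link cannot carry a full 4K image.
projectors = 24                             # 6 walls x 2 stacked x 2 eyes
channels_per_projector = total_channels // projectors   # 4

print(total_channels, channels_per_eye, channels_per_projector)
```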

"Our biggest challenge in designing and developing the system was in gen-locking and swap-synching," says Hoffmeister. "The number of nodes is simply unprecedented. I don't know of anyone else who has attempted such a complex graphics system. The key is the Quadro graphics combined with the Gsync technology"

The graphics cluster is connected to the projectors by multi-mode fiber-optic DVI cables for noise-free data transfer. The entire C6 system uses a total of 3.2 miles (5.15 km) of fiber-optic cable.

The C6 room itself can surround up to seven viewers with high-resolution imagery on all sides, above, and below. The floor consists of a three-inch-thick acrylic slab with a projection screen, which is covered by a thin protective layer of acrylic to shield it from the viewers' feet. The result is that viewers are completely enveloped in imagery.

In the battlespace management application, which is sponsored by the US Air Force Office of Scientific Research, the C6 system allows one commander to remotely operate up to eight UAVs at once, visualizing terrain, other aircraft, and both friendly and hostile forces. Existing virtual reality systems offered some of these capabilities, but not at the level of detail commanders need on the battlefield. Now they can see detail down to individual soldiers.

Another military application, sponsored by the National Guard, is a traditional virtual reality function, where soldiers can use the technology in the C6 to conduct virtual terrain walks of a potential battlefield in advance of a patrol or deployment—sort of an "architectural walkthrough" for the military.

But the military and traditional virtual reality applications are just scratching the surface of the C6 environment's potential. Perhaps the most innovative and valuable applications are those that use the 3D capabilities of the system to visualize data. The human brain is very visually oriented; often we can best understand information that we can see. By immersing themselves in 3D representations of their data, researchers can often, quite literally, see relationships that might pass unnoticed with traditional analytical tools.

One such research application currently under development, Meta!Blast, is used to study the structure and function of the cells of soybeans and other plants. The research, sponsored by the National Science Foundation, seeks to visualize what would actually happen within an individual cell as a researcher or student makes molecular changes to it.

Other applications include urban planning, to visualize the effects of growth and increase the efficiency of various designs and plans. Mechdyne has also ported a number of commercial applications for use in the C6 chamber. You can conduct a fly-through of the Grand Canyon using Google Earth on your home PC, but it takes on a completely different level of realism in the C6 environment.

Of course, much of what enables this system is human and organizational. Much of the "vision" behind the project stems from the fact that VRAC is a multidisciplinary research center, not a lab focused solely on computer science and engineering. VRAC projects like the C6 are not simply leading-edge engineering; they are also platforms for cutting-edge research.

The C6 and other VRAC efforts are designed to solve specific problems for government, industry, and academia. This gives the projects a degree of practicality that is often absent from engineering research not geared toward specific real-world goals, and it should fast-track the research results into practical, deployable solutions.

"One hundred million pixels is a tremendous achievement," says Hoffmeister. "But what makes C6 really compelling is that we're solving real problems and enabling discoveries that better people's lives."

This article was written by David Wilton, Technology Writer, NVIDIA (Santa Clara, CA).

This article first appeared in the May 2008 issue of Embedded Technology Magazine (Vol. 32, No. 5).
