In 2007, Dr. Hrayr Shahinian was looking for an engineering team to help him develop an endoscopic device suitable for brain surgery, and capable of both steering its lens and producing a three-dimensional video image. He discovered that the person he was seated next to at a social function was Charles Elachi, director of NASA’s Jet Propulsion Laboratory (JPL).
Director of the Skull Base Institute in Los Angeles, Shahinian helped to pioneer minimally invasive endoscopic brain surgery in the mid-1990s. As the industry shifted away from open-skull operations to endoscopic techniques, the risk of complications for most surgeries plummeted, as did the length of hospital stays and rehabilitation time. The change was not, however, without its drawbacks.
“It became obvious to me, even years ago when I was converting to endoscopic, that we were losing depth perception,” Shahinian said, noting that no matter how high-definition the image an endoscope may produce, it’s still flat, making it difficult for the surgeon to see how close the tumor is to potentially critical nerves or tissue behind it, for example. “I realized that 3D endoscopy is the future.”
The problem, though, was that this sort of brain surgery is carried out in exceedingly close quarters, so the device couldn’t be more than four millimeters in diameter. This ultimately ruled out the possibility of using dual lenses to create a 3D image.
NASA, too, has a constant need for high-quality imaging: wherever optics and 3D imaging are involved, the technology can be adapted for planetary exploration. For example, on a rover, such a camera could peer into the opening left by a rock core extraction.
Seeing this common interest, NASA entered into a Space Act Agreement with Shahinian, and he obtained a license for the technology. After the dual-lens model was scrapped due to poor image quality at such a small lens size, the team hit on another idea. Because 3D imaging, like human depth perception, requires two slightly different viewpoints, they incorporated two apertures with complementary color filters into a single lens.
To create a fully lit image that can be viewed with standard 3D glasses, the team relied on color filters. Each aperture passes specific wavelength bands of red, green, and blue light that the other aperture blocks; together, the six bands make up white light. The light source, a xenon cold lamp, cycles rapidly through these six wavelength bands in sequence, with only half of the reflected light passing through each aperture. The result is two separate, full-color images, which are then run through the standard software used to create and display images viewable with the same polarized 3D glasses used in movie theaters.
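The complementary-filter scheme can be sketched in a few lines of Python. This is a toy model, not the actual MARVEL implementation: the band names, aperture assignments, and scene reflectances below are all illustrative assumptions.

```python
# Toy model of single-lens 3D imaging with complementary color filters.
# Six narrow illumination bands: two each of red, green, and blue.
# (Band labels and values are hypothetical, for illustration only.)
BANDS = ["R1", "G1", "B1", "R2", "G2", "B2"]

# Each aperture passes one band per color and blocks the other three.
APERTURE_LEFT = {"R1", "G1", "B1"}   # assumed assignment
APERTURE_RIGHT = {"R2", "G2", "B2"}

def capture_cycle(scene):
    """Simulate one illumination cycle: the lamp steps through all six
    bands in sequence, and each frame of reflected light reaches only
    the aperture whose filter passes that band. Returns two full-color
    images, modeled here as dicts mapping color channel -> intensity."""
    left, right = {}, {}
    for band in BANDS:                    # lamp cycles band by band
        intensity = scene.get(band, 0.0)  # reflected light in this band
        channel = band[0]                 # 'R', 'G', or 'B'
        if band in APERTURE_LEFT:
            left[channel] = intensity
        else:
            right[channel] = intensity
    return left, right

# Hypothetical scene: per-band reflectance as seen from each viewpoint.
scene = {"R1": 0.8, "G1": 0.5, "B1": 0.2,
         "R2": 0.7, "G2": 0.6, "B2": 0.3}
left_img, right_img = capture_cycle(scene)
print(left_img)   # full-color image from one viewpoint
print(right_img)  # full-color image from the other viewpoint
```

Because the two apertures sit at slightly offset positions in the lens, the two reconstructed images differ in viewpoint, which is what standard stereo-display software then turns into depth.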
The other challenge to overcome was Shahinian’s specification that the end of the camera be able to steer side to side. To date, the endoscopes available to brain surgeons have all been either straight-looking or fixed-angle. The team, however, enabled the camera, which is controlled with a joystick, to turn 60 degrees in each direction.
In December of 2013, Shahinian had the first prototype of the Multi-Angle Rear-Viewing Endoscopic Tool (MARVEL) in his hand. Two more stereo camera prototypes have since been built. Shahinian said the improved visibility while performing minimally invasive surgery will improve safety for many types of operations, speeding patient recovery and, ultimately, reducing medical costs.