The goal of robotic and robotic-assisted surgery is to enable surgeons to perform complex procedures that were previously unavailable, with increased precision that reduces surgery and recovery times and lowers risks for patients. Robotic surgery has made a significant impact in many applications, including prostatectomy, nephrectomy, hysterectomy, and colorectal surgery. With recent advances in technology, there are now more robotics applications in development than ever before.

To improve surgical workflow, site access, and recovery times, new innovations are appearing in all subsystems across the surgical robotic architecture. Improving image quality with accurate and consistent visualization allows surgeons to make more informed surgical decisions during a procedure. Surgical vision systems pair wide field-of-view cameras with fiber optic or LED illumination components. Often in product development, however, the performance requirements and design of the illumination system receive far less time and resources than the camera.

To have a successful product, one must consider all the sub-systems required to deliver high-quality illumination. A specific example of this situation is a high-definition 3D laparoscope utilizing a chip-on-tip camera.

The 3D surgical vision system has four key subsystems:

  1. The illumination system, which brings light to the surgical target;
  2. The camera (lenses and CMOS sensor), which captures light returned from the tissue;
  3. Firmware, which controls image quality and latency; and
  4. The display system (a combination of 2D and 3D displays).

Each subsystem has its own key questions the design team should consider.

Clinical Applications

Prior to designing a robust illumination system, the design engineer must have a comprehensive understanding of the clinical team’s goals for a given surgical procedure. Often, a product manager acting as the “Voice of the Customer” will identify a predicate device and ask for the “best image quality.” The R&D team must translate this request into quantitative requirements, identifying imaging modalities and numerical limits on, for example, field of view (FOV), resolution, color accuracy, and image contrast, eventually leading to the complete product requirements. In this article we consider a light source for a 3D laparoscope with a camera field of view of 80° and a working distance of 5 to 100 mm. We primarily consider white light applications but discuss fluorescence considerations as well.
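As a rough illustration of how these numbers interact, the sketch below (a simplified geometric estimate, not taken from any specific product) computes the field diameter the camera sees at each working distance, which the illumination system must cover.

```python
import math

def field_diameter_mm(working_distance_mm: float, fov_deg: float) -> float:
    """Diameter of the imaged field at a given working distance,
    assuming simple pinhole-style geometry and a full field of view fov_deg."""
    half_angle = math.radians(fov_deg / 2.0)
    return 2.0 * working_distance_mm * math.tan(half_angle)

# Example: the 80-degree FOV laparoscope discussed above
for wd in (5, 25, 50, 100):
    print(f"WD {wd:>3} mm -> field diameter ~ {field_diameter_mm(wd, 80):.0f} mm")
```

At the far end of the 5 to 100 mm working-distance range, the illuminated field must span roughly 170 mm, which is what drives the demand for high illumination angles discussed below.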

To elucidate this, we consider the design of a fiber-based illumination system with an LED light engine installed in a confined equipment housing as part of the “capital equipment,” i.e., the vision tower. The capital equipment encompasses the cart that typically houses the vision and additional control systems of the surgical platform. The intended architecture is a rigid stereo-laparoscope for use in a robotic surgical system. To reduce risks to schedule and safety, and to keep future user needs such as fluorescence or other source-dependent imaging open, we consider a fiber-based solution. The authors appreciate the advances LEDs continue to make in size and efficiency and will address that design space at the end of the article.

Illumination Considerations for Robotic Surgery

Figure 1 highlights the major system architectures of the illumination system for a robotic surgery platform. To deliver light to the scope, an illumination source – in this case, a light engine – is required. The light engine will couple light into a fiber taper, if needed, and then deliver it to the fibers that will transmit light to the tip.

The light engine is a light source installed in the capital equipment. There are different architectures for these sources, but they can be distilled down to two primary types: some light engines use a single, broadband source, while others mix narrow-band LEDs to create a broadband source. A single broadband LED carries the risk of needing to correct for excess blue light, because the white LED architecture uses a blue LED to excite a phosphor. A high proportion of blue light is absorbed by red tissue. The high-blue signal in the spectrum can lead to challenges in the color-tuning stage and, potentially, images that look too digitized or “fake looking.” A mixed RGB LED approach can eliminate the excess blue-light issue but requires more complex optics in the light engine to couple the three sources into the system. If the system requires near-infrared (NIR) illumination, the NIR LEDs are installed in the light engine as well, compacting the design.

By housing the RGB and NIR LEDs in the same enclosure, the light sources can share the same fibers that deliver the light to the tip, maximizing the efficiency of the endoscope’s illumination system. To transfer light from the light engine to the tip of the endoscope, high-numerical-aperture (NA) fiber optics are required, along with an optical system to relay light from the source to the tip. Numerical aperture describes the angular output of a fiber: the NA is equal to the sine of the largest angle at which light can enter or exit the fiber, and it is set by the refractive indices of the fiber’s core and cladding. The higher the NA, the larger the angle at which light exits the fiber and the greater the percentage of the field of view it can illuminate.
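For quick reference, these relationships can be written down directly. The sketch below uses illustrative (not vendor-specific) index values and the standard step-index relation NA = sqrt(n_core² − n_clad²) to estimate the exit half-angle in air.

```python
import math

def fiber_na(n_core: float, n_clad: float) -> float:
    """Numerical aperture of a step-index fiber from core and cladding indices."""
    return math.sqrt(n_core**2 - n_clad**2)

def exit_half_angle_deg(na: float, n_medium: float = 1.0) -> float:
    """Half-angle of the exit cone in a medium of index n_medium (air by default)."""
    return math.degrees(math.asin(min(na / n_medium, 1.0)))

# Illustrative glass-core / polymer-clad values, chosen for example only
na = fiber_na(n_core=1.62, n_clad=1.49)   # ~0.64 NA
print(f"NA = {na:.2f}, exit half-angle ~ {exit_half_angle_deg(na):.0f} deg")
```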

To get the best performance from the fiber optic cable, the design team must consider the relationship between the output of the light engine and the acceptance of the fiber optic cable. A common solution is to use a fiber taper to increase the angle of the light entering the endoscope. The fiber taper is typically installed on the proximal end of the endoscope, where the light cable connects. The taper converts the large-area, low-angle output of the light engine into a small-area, high-angle output.

Figure 2. A fiber taper is a common intermediate component used to alter the light at an interface between a light source and endoscopic fiber bundle. As the taper diameter decreases, the ray angles increase, resulting in a higher angle distribution at the exit than the entering angle.

The NA of the light exiting the light engine is typically on the order of 0.5, while the angles associated with surgical robotics can correspond to 0.87 NA or higher. The fibers that connect to the light engine should match its exit NA. The taper then converts the low-angle light to high-angle light to achieve the broadest illumination angle. Figure 2 shows what happens to a beam of light entering and exiting the taper.
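As a first-order check, the required taper ratio follows from the input and target NAs. The sketch below assumes an ideal, lossless taper in which étendue is conserved (D_in × NA_in = D_out × NA_out); real tapers have transmission losses, and the output NA is ultimately capped by the fiber itself.

```python
def taper_ratio(na_in: float, na_out: float) -> float:
    """Diameter ratio (input/output) of an ideal taper converting na_in to na_out.
    Assumes lossless etendue conservation: D_in * na_in = D_out * na_out."""
    return na_out / na_in

def output_diameter_mm(d_in_mm: float, na_in: float, na_out: float) -> float:
    """Output bundle diameter for a given input diameter and NA conversion."""
    return d_in_mm / taper_ratio(na_in, na_out)

# Light engine output at ~0.5 NA converted toward the ~0.87 NA discussed above
print(f"Required taper ratio ~ {taper_ratio(0.5, 0.87):.2f}:1")
print(f"A 5 mm input bundle tapers to ~ {output_diameter_mm(5.0, 0.5, 0.87):.1f} mm")
```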

Figure 3. To illuminate the large volumes in laparoscopic surgery, the illumination design should use as much of the available cross section of the device tip as possible. Dividing the fiber bundle into sections allows for easier integration into the shaft of the device. In the example shown, there are four fiber optic illumination ports (represented by the white dots) and two visualization (detection) channels with sapphire windows.

An alternative to using a taper to reach high output angles is to design a lens that spreads the light exiting the tip of the laparoscope. A lens-aided illumination system allows for higher output angles, enabling higher-FOV cameras for use in the body, but comes at the cost of a less compact design.

Once light is transmitted to the fiber optics of the laparoscope, the fibers are packaged to output light across the tip, as shown in Figure 3. This is more beneficial than a single light output face for two reasons: first, it enables easier integration of the fibers into the scope, and second, it prevents unwanted shadows from surgical tools from degrading the image.

Operator aligning lenses for a 3D stereo endoscope.

Calibration and Testing Considerations

When designing the light source, the team must also consider the image signal processing (ISP) pipeline that will convert the captured image for display on high-definition 2D and 3D monitors for the surgical team. The ISP can apply various calibrations to the system, including dark signal non-uniformity on the image sensor, photo-response non-uniformity, color calibration, and white balancing. These calibrations allow corrections that create a high-quality image; however, if the ISP relies too heavily on calibrations, the image may look highly processed and become a distraction to the surgical team.

An ISP will have blocks that require per-unit calibration. The calibration data is typically saved in memory installed on the endoscope. Defining the calibration process early and coordinating with the ISP development engineers will reduce the risk of late-stage development issues. By considering the ISP and calibrations early, multiple revisions of the light source and firmware are possible prior to product launch. Calibrations do have limits: if the illumination system is designed with the intended surgical use in mind from the start, less troubleshooting of calibrations is required during development.

Test target being used to measure the image quality of a surgical endoscope.

Examples of calibrations related to the illumination source are photo-response non-uniformity (PRNU), white balance, and color correction. These calibrations are all limited in their effectiveness if the light source itself has an inferior design, and reliance on calibrations to “fix” the light source can lead to an over-processed image. Additionally, if the ISP must devote memory and processing to extensive calibrations, there is a risk of increasing the latency of the vision system, limiting robotic performance.
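To make these calibration steps concrete, the sketch below shows a simplified, hypothetical correction pipeline (not the ISP of any particular system): a PRNU gain map derived from a flat-field capture of a uniform target, and per-channel white-balance gains derived from a neutral gray patch.

```python
import numpy as np

def prnu_gain_map(flat_field: np.ndarray) -> np.ndarray:
    """Per-pixel gain map computed from a flat-field capture of a uniform target."""
    return flat_field.mean() / np.clip(flat_field, 1e-6, None)

def white_balance_gains(gray_patch_rgb: np.ndarray) -> np.ndarray:
    """Per-channel gains that equalize the R, G, B means of a neutral gray patch."""
    means = gray_patch_rgb.reshape(-1, 3).mean(axis=0)
    return means.mean() / means

def correct(raw_rgb: np.ndarray, gain_map: np.ndarray, wb: np.ndarray) -> np.ndarray:
    """Apply PRNU and white-balance corrections to a raw RGB frame."""
    return raw_rgb * gain_map[..., None] * wb

# Usage with synthetic data standing in for real captures
flat = np.random.normal(1000, 30, (480, 640))            # flat-field exposure
gray = np.random.normal([900, 1000, 800], 10, (32, 32, 3))  # gray-patch crop
frame = np.random.normal(500, 50, (480, 640, 3))         # raw scene frame
out = correct(frame, prnu_gain_map(flat), white_balance_gains(gray))
```

The better the native uniformity and spectral balance of the light source, the smaller these correction factors become, which is exactly the point made above about not relying on calibration to rescue an inferior source.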

Finally, after the light engine, illumination, imaging optics, and camera firmware are designed, proper testing is required. Often the illumination components require 100% inspection and calibration, both for the sources in the capital equipment and for the laparoscopes themselves. These tests require operating the device under test in a variety of conditions using specialized targets to measure color accuracy, uniformity, and power output. Designing a system to automate these tests reduces the risk of part-to-part and tester-to-tester variability, ensuring product standards are maintained in the field. These test stations require detailed mechanical, system, and software design to ensure successful deployment on manufacturing floors.
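One simple automated check such a station might run (shown here only as an illustrative example, not a prescribed standard) is illumination uniformity across the field, for example the ratio of the dimmest corner region to the center region in a capture of a uniform white target.

```python
import numpy as np

def region_mean(img: np.ndarray, cy: float, cx: float, size: int = 32) -> float:
    """Mean intensity of a size x size window centered at fractional coords (cy, cx)."""
    h, w = img.shape
    y, x = int(cy * h), int(cx * w)
    half = size // 2
    return float(img[max(y - half, 0):y + half, max(x - half, 0):x + half].mean())

def uniformity(img: np.ndarray) -> float:
    """Corner-to-center brightness ratio for a capture of a uniform white target."""
    center = region_mean(img, 0.5, 0.5)
    corners = [region_mean(img, cy, cx) for cy in (0.1, 0.9) for cx in (0.1, 0.9)]
    return min(corners) / center

frame = np.random.normal(200, 5, (1080, 1920))  # synthetic flat-field capture
print(f"Uniformity (min corner / center): {uniformity(frame):.2f}")
```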

There are other considerations for flexible endoscopy applications. These devices often have tighter limits on the space available for illumination, may be limited to 2D imaging, may be single-use, or may have other constraints not covered by the parameters presented in this article. For small-diameter and single-use devices, plastic fibers, LEDs at the tip, and other more compact solutions can enable a successful product, though they bring different design considerations and risk mitigations.

To summarize, the development of the illumination components for robotic surgery systems is a complex process. One should start from a complete understanding of the clinical application and build from that understanding. For a robotic system intended for white light and NIR applications, we recommend a design that uses high-NA fibers to deliver light to the tip of the device, fed by a light engine installed in the capital equipment; this achieves the broadest illumination angle with the most concise design and avoids unnecessary complexity. Other solutions can lead to gaps in functionality, resulting in a workaround design.

This article was written by Jonathan Brand, Optical Systems Engineer, and Neil Anderson, PhD, VP Sales and Marketing, Gray Optics (Portland, ME). For more information, contact Neil Anderson at Gray Optics.