In machine vision systems, acquiring sharp images of moving targets is a challenge. Consequently, capturing the best possible image requires three fundamentals to be well defined:

  • an excellent camera;
  • an appropriate lens;
  • an appropriate illumination.

All three items are key to enabling successful subsequent image analysis; poor image quality will tax any machine vision application.

Figure 1. Signal to noise ratio (SNR) related to illuminance.
The choice of illumination is the first step and directly affects the quality of the images. Many difficulties can result from selecting the wrong illumination. In general, what is not illuminated correctly cannot be evaluated by software, or even by humans. A camera also cannot be compared to the human eye, which adapts automatically and copes flexibly with difficult tasks; a human would even change the distance or the angle of view to discover details. A machine vision camera cannot detect edges if the object is not illuminated correctly.

In today’s vision applications, a wide variety of tasks have to be solved. Shiny metallic, matt dark, or even transparent surfaces with different features require different types of illumination, which has led to the development of different illumination technologies. Many aspects affect the choice of the correct illumination and have to be considered:

  • area to be illuminated;
  • camera in use;
  • speed of the application and the camera itself;
  • color of the illuminated objects;
  • environment;
  • behavior and characteristics of the object (glossy, diffuse, height variations,…);
  • expected / required lifetime of the application.

If all of these parameters are understood, the application can be simulated and it can be determined which illumination fits best.


Figure 2. Example of SNR resulting from shot noise (20 μs integration time, 200 DPI resolution, 80% diffuse reflection).
Many people are not aware of the ingredients of a good machine vision application. In line-scan applications, the light should be concentrated where the sensor array(s) of the camera are focused; light outside this area is wasted and results in extra cost and heat.

Figure 3. Spectral changes of an LED over temperature, 55°C as reference.
In all imaging technologies, one important quality criterion is noise. There are several sources of noise in an imaging system, but normally shot noise dominates. Shot noise is caused by a physical effect and has nothing to do with camera quality. The reason for shot noise lies in the discrete nature of light (photons) and the resulting discrete generation of electrons in a sensor pixel.

Shot noise follows a Poisson distribution, so the signal-to-noise ratio can be described as SNR = √N_e, where N_e is the number of electrons collected in a pixel.
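As a minimal sketch of this relationship, the following snippet computes the shot-noise-limited SNR directly from an (assumed) electron count; the function name and the example counts are chosen only for illustration:

```python
import math

def shot_noise_snr(n_electrons: float) -> float:
    """Shot-noise-limited SNR: SNR = sqrt(N_e)."""
    return math.sqrt(n_electrons)

# 10,000 collected electrons give an SNR of 100 (40 dB).
print(shot_noise_snr(10_000))  # 100.0

# Doubling the collected charge improves SNR only by sqrt(2), not by 2:
print(shot_noise_snr(20_000) / shot_noise_snr(10_000))  # ≈ 1.414
```

The square-root behavior is why gaining one factor of two in SNR always costs a factor of four in light.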

The number of electrons is directly proportional to the number of photons. The number of photons is directly proportional to the product of sensor illuminance and exposure time. In a given imaging setup with a defined optical transformation there are three parameters that influence the shot noise in an image:

  • integration time (scanning speed);
  • f-stop (depth of focus and maximum sharpness);
  • illuminance on the scanned object.
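How these three parameters interact can be sketched in a toy model. The code below assumes N_e is proportional to illuminance × exposure time and to the aperture area (1/f-number²); the lumped constant `k` is a hypothetical placeholder for sensor quantum efficiency, pixel area, and optics transmission, chosen only for illustration:

```python
import math

def collected_electrons(illuminance_lux: float,
                        exposure_s: float,
                        f_number: float,
                        k: float = 1e7) -> float:
    """Toy model: N_e scales linearly with illuminance and exposure
    time, and with aperture area, i.e. 1/f_number**2.
    k is a hypothetical lumped constant (QE, pixel area, optics)."""
    return k * illuminance_lux * exposure_s / f_number**2

def snr(illuminance_lux: float, exposure_s: float, f_number: float) -> float:
    """Shot-noise-limited SNR for the toy model above."""
    return math.sqrt(collected_electrons(illuminance_lux, exposure_s, f_number))

# Halving the integration time (e.g. doubling the scanning speed)
# halves N_e and therefore costs a factor sqrt(2) in SNR:
print(snr(1000, 20e-6, 4) / snr(1000, 10e-6, 4))  # ≈ 1.414
```

The same square-root penalty applies to each parameter: doubling the scan speed, closing the lens by one stop, or halving the illuminance all reduce SNR by √2.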

The f-stop of a lens has a significant impact on the requirements for light. For example, changing the f-stop from 4 to 5.6 increases the light requirement by a factor of two to keep the same signal-to-noise ratio. At the same time it increases the depth of focus and, with most lenses, improves the optical quality: depth of focus and sharpness increase while vignetting effects are reduced. What machine vision application wouldn’t benefit from a sharper image and an increased depth of field?