
Making the Move to Digital in Machine Vision

Analog cameras dominated the early years of machine vision systems, offering adequate performance, a simple interface, and a moderate price. Technology advances, however, are now tipping the scales in favor of digital cameras for most new and many legacy applications. Falling prices, standardized interfaces, and opportunities for customized preprocessing are making the analog-to-digital transition painless and profitable.

In the earliest years of machine vision systems, the only video cameras available were those developed for television. These early cameras produced an analog signal at a fixed 30 frames per second with limited resolution. They were neither intended for direct connection to a computer nor for use in a control loop of any kind. To utilize them, machine vision systems needed to incorporate an integrated digitizer and frame grabber to convert and store the video information for processing.

An image processing system built around an analog camera has three elements, as shown in Figure 1. The camera provides a simple analog signal, typically conforming to the RS-170 standard, carried on a conventional coaxial cable to the frame grabber. The frame grabber uses an internal digitizer to convert the analog signal to pixels and stores the data in memory. An image processing element, typically a PC, takes data from the frame grabber for processing and display. Because the frame grabber and the image processor are independent system elements, their programming is not automatically coordinated.
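To make that division of labor concrete, here is a minimal sketch of the grab-then-process flow. No vendor SDK is assumed: the frame grabber call is a stand-in defined inside the sketch, and the image is fabricated so the example runs on its own.

```python
import numpy as np

# Stand-in for the frame grabber driver. In a real system this call would
# return an RS-170 frame already digitized into the grabber's memory; here it
# fabricates a blank 640 x 480, 8-bit image so the sketch is self-contained.
def wait_for_frame():
    return np.zeros((480, 640), dtype=np.uint8)

def process_image(frame):
    # Placeholder for the PC-side image processing step.
    print("mean intensity:", frame.mean())

# The grab and the processing are separate steps that the developer,
# not the hardware, has to keep coordinated.
for _ in range(3):
    frame = wait_for_frame()   # step 1: frame grabber digitizes and stores
    process_image(frame)       # step 2: the PC pulls the data and processes it
```

In a real system, wait_for_frame would be a call into the frame grabber vendor's driver, and keeping its buffering in step with the processing loop is exactly the coordination burden described above.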

Using an external digitizer with an analog camera creates some side effects that complicate image processing. One is ambiguity in the relationship between the physical location that a digital sample represents and the corresponding pixel’s location in the digital image. The digitizer’s sample clock and the camera’s line signal sweep must be coordinated and repeatable for the resulting pixels to produce a spatially correct image. Synchronization errors, as well as timing jitter in the sampling clock, will result in image pixels that are offset from their true location (see Figure 2).
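To put rough numbers on this effect, the short calculation below converts timing errors into horizontal pixel displacement. The sample clock rate, jitter, and sync error are assumed figures for illustration, not values from the article.

```python
# Illustrative numbers: how timing errors translate into horizontal pixel
# offsets when an analog video line is digitized externally.
sample_clock_hz = 12.2e6                  # assumed square-pixel sample rate for RS-170
sample_period_s = 1.0 / sample_clock_hz   # ~82 ns per pixel

clock_jitter_s = 10e-9                    # assumed 10 ns of sampling-clock jitter
sync_error_s = 200e-9                     # assumed 200 ns error locking to the line sync

jitter_offset_px = clock_jitter_s / sample_period_s
sync_offset_px = sync_error_s / sample_period_s

print(f"jitter displaces samples by about {jitter_offset_px:.2f} pixels")
print(f"a sync error shifts the whole line by about {sync_offset_px:.1f} pixels")
```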

Another side effect of external digitization is that the horizontal and vertical resolution of the image can differ. The analog camera’s line rate determines the image’s vertical resolution and the digitizer’s sample rate determines the horizontal resolution. Without careful matching of the digitizer to the line rate, the image pixels will not represent the square area samples that image processing algorithms assume. Matching to achieve square pixels, however, locks the system data rate to the camera’s line resolution.
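As a rough worked example using standard RS-170 timing (the active-line duration is an approximation), the sketch below computes the digitizer sample rate needed to produce square pixels for a 640 x 480 image.

```python
# Approximate RS-170 timing: about 63.6 us per line, of which roughly 52.6 us
# is active video. The digitizer must place 640 samples in that active window
# to produce square pixels for a 640 x 480, 4:3 image.
active_line_s = 52.6e-6        # approximate active video time per line
pixels_per_line = 640          # horizontal count for square pixels at 480 lines

required_sample_rate_hz = pixels_per_line / active_line_s
print(f"required sample rate: {required_sample_rate_hz / 1e6:.2f} MHz")
# -> roughly 12.2 MHz; this rate is dictated entirely by the camera's line
#    timing, which is the sense in which square pixels lock the system data
#    rate to the camera's line resolution.
```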

Digital cameras behave quite differently. Each light-gathering region on a digital sensor receives independent digitization that does not depend on clock timing, so synchronization is not needed and timing jitter does not introduce spatial distortion. This timing independence means that sensor physical design alone determines both horizontal and vertical resolution, so image pixels are inherently square. Further, the clocking speed for digital camera image transfers becomes, essentially, independent of the image resolution. The only clocking requirement is that the system’s pixel clocking rate must be fast enough to transfer the entire image within the frame time. Even that is not a hard and fast rule. Digital cameras can be configured to transfer out only an area of interest within the image, reducing the requirements on the pixel clock.
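To illustrate that last point, the sketch below compares the minimum pixel clock needed to move a full frame with the rate needed for a small area of interest at the same frame rate. The sensor size, frame rate, and window size are assumed example figures, not taken from the article.

```python
# Assumed example figures: a 1024 x 1024 digital sensor read out at 30 frames/s,
# versus a 256 x 256 area of interest at the same frame rate.
frame_rate_hz = 30

def min_pixel_clock(width, height, fps=frame_rate_hz):
    # The only hard requirement: move every pixel of the (possibly windowed)
    # image within one frame time.
    return width * height * fps

full = min_pixel_clock(1024, 1024)
roi = min_pixel_clock(256, 256)

print(f"full frame needs at least {full / 1e6:.1f} Mpixels/s")       # ~31.5 Mpixels/s
print(f"256 x 256 window needs at least {roi / 1e6:.1f} Mpixels/s")  # ~2.0 Mpixels/s
```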

Digital Interfaces Simplify

Because the data coming from the camera is digital, the interface to the rest of the machine vision system is somewhat more complex than for analog cameras. Early digital camera designs used proprietary, high-speed interfaces with low-voltage differential signaling (LVDS). This required large, bulky, and expensive cables that could only run for a limited distance before connecting to the frame grabber or processor. Further, because the camera interfaces were proprietary, system developers needed to ensure that the frame grabber or processor interface they used would match the camera’s interface. In practice, this often meant obtaining both elements from the same manufacturer to ensure compatibility.
