Machine vision systems are playing an increasingly important role in many industrial applications, whether it is counting parts on an assembly line or examining surfaces for defects. Improvements in computing power, optics, connectivity, and software are allowing vision systems to be deployed in a wider range of applications.

Vision systems using cameras such as SICK’s IVC-3D inspect intricate surfaces such as brake pads.
Although machine vision systems have been around for several decades, their potential is not fully realized, according to Amir Novini, president and founder of Applied Vision Corp. (Akron, OH). “Machine vision remains one of the best-kept secrets; a lot of people still don’t know about it,” he said. “We’re going to see machine vision in everyday products and systems. They will become more adaptable, more sophisticated. We’re working on coupling artificial intelligence to machine vision to make it more forgiving and adaptive.”

Counting parts on an assembly line is a common machine vision application. But as lighting systems improve, machine vision systems are now also used to search materials for cracks, scratches, and other faint, low-contrast surface structures, according to Ben Dawson, director of strategic development at DALSA Industrial Products (Billerica, MA).
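
As a rough illustration of the kind of processing such an inspection can involve (a generic sketch, not a description of DALSA’s tools), the following Python/OpenCV snippet highlights scratch-like edge features in a grayscale surface image. The file name and threshold values are illustrative assumptions.

```python
import cv2

# Hypothetical input: a grayscale image of a machined or coated surface.
image = cv2.imread("surface.png", cv2.IMREAD_GRAYSCALE)
if image is None:
    raise SystemExit("surface.png not found")

# Suppress fine texture noise, then extract high-contrast edges that
# could correspond to cracks or scratches (thresholds are illustrative).
blurred = cv2.GaussianBlur(image, (5, 5), 0)
edges = cv2.Canny(blurred, 50, 150)

# Keep only elongated edge chains, a crude proxy for scratch-like defects.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
suspects = [c for c in contours if cv2.arcLength(c, False) > 40]
print(f"{len(suspects)} scratch-like features found")
```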

High-speed food and beverage processing operations are also increasing their use of machine vision, noted Novini. In such applications, the vision system is often required to keep up with count rates of 600 to several thousand objects per minute.

Implementing a vision system remains a somewhat daunting task, often requiring the expertise of a systems integrator. A fair amount of setup and programming is required to train the systems to perform specific tasks. “It is hard to put together a flexible system that does more than the one thing it is assigned to,” said Karl Gunnarsson, vision manager of SICK Inc. (Minneapolis, MN). “There still have to be known parts — the same parts should appear again and again. A lot of times the part being inspected needs certain features, such as holes or other surface features.”

Nevertheless, machine vision is making impressive strides in areas such as software. “Previously, vision algorithms could only analyze two-dimensional planar surfaces. They are now looking at 3-D surfaces,” said DALSA’s Dawson.
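
To give a concrete sense of what moving from 2-D to 3-D analysis entails (an illustrative sketch, not any vendor’s algorithm), the Python snippet below fits a reference plane to a height map such as a 3-D camera might produce and flags points that deviate beyond a tolerance. The tolerance and the synthetic test data are assumptions.

```python
import numpy as np

def flag_surface_defects(height_map, tolerance_mm=0.2):
    """Fit a reference plane to a height map (rows x cols of z values, in mm)
    and return a boolean mask of points deviating beyond the tolerance."""
    rows, cols = height_map.shape
    ys, xs = np.mgrid[0:rows, 0:cols]

    # Least-squares plane fit: z ~ a*x + b*y + c
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(rows * cols)])
    coeffs, *_ = np.linalg.lstsq(A, height_map.ravel(), rcond=None)

    # Deviation of each measured height from the fitted plane
    fitted = (A @ coeffs).reshape(rows, cols)
    return np.abs(height_map - fitted) > tolerance_mm

# Synthetic example: a nominally flat part with one 0.5 mm deep pit.
z = np.zeros((100, 100))
z[40:45, 60:65] -= 0.5
print(flag_surface_defects(z).sum(), "out-of-tolerance points")
```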

Vision software is also becoming more flexible, with the ability to be deployed on different systems, according to John Agapakis, machine vision business manager at Siemens Energy and Automation (Alpharetta, GA). “We offer the SIMATIC Visionscape image processing software, which allows programming for either PC-based or smart-camera-based vision environments. You can use the same software to program different vision systems.”

Visionscape allows simultaneous viewing of images from several cameras. It can also be used for line-scan applications such as checking labels on cylindrical objects like bottles.

Software advances have been accompanied by more powerful hardware. “The most noticeable difference is the speed of the computing hardware, which is several orders of magnitude faster than years ago,” said Novini.

“The progress made in providing high-performance, low-cost components has broadened the scope of machine vision into more applications,” added Stephane Francois, executive vice president of Leutron Vision Inc., a Swiss-based supplier of machine vision cameras and hardware.

Some processing power now resides not on a separate vision processing board but inside the vision camera itself. No longer mere image capture devices, vision cameras are scaling up the technology curve by adding intelligence and processing capability. The enhanced cameras are sometimes referred to as “smart cameras.”

Siemens teams its SIMATIC Visionscape image processing software with its HawkEye 1600T Smart Camera.
“Algorithms that used to require lots of processing power now can be done on the smart camera itself,” said Agapakis. One of Siemens’ smart cameras, the SIMATIC HawkEye 1600T, combines image capture, image processing and analysis, and communications into a compact housing. These cameras are suited for applications where several inspection tasks must be performed in a single test cycle.

The smart cameras can ease cabling. “With a smart camera, the cable connection becomes lighter and simpler,” said Agapakis. “Previously, you needed a heavy cable with lots of small conductors to get the signal off the robot arm to the processor.”

Joe Christenson, president and CEO of PPT Vision Inc. (Eden Prairie, MN), added, “The interconnectivity and networking features in most of today’s smart cameras make it easier to implement a multi-camera solution in motion control applications. In addition to the traditional RS232/485 and discrete I/O data transfer, most smart camera technology today is designed to support a wide range of communication protocols, such as TCP/IP, Modbus, DeviceNet, and OPC. This makes sharing information among different cameras and between the cameras and the host computer fast and effective — a must-have for lots of motion control applications such as robot guidance and multi-axis pick-and-place arm control.”
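
As a minimal sketch of the kind of host-to-camera exchange Christenson describes, the snippet below polls an inspection result from a smart camera over a plain TCP/IP socket. The IP address, port, and ASCII command/response format are hypothetical; real cameras expose vendor-specific commands or fieldbus interfaces such as Modbus or OPC.

```python
import socket

CAMERA_ADDR = ("192.168.0.50", 5000)  # hypothetical smart-camera IP address and port

def get_inspection_result(timeout_s=2.0):
    """Request the latest pass/fail result from the camera over raw TCP.
    Assumes a simple ASCII protocol: send 'RESULT?', receive e.g. 'PASS' or 'FAIL,crack'."""
    with socket.create_connection(CAMERA_ADDR, timeout=timeout_s) as sock:
        sock.sendall(b"RESULT?\n")
        return sock.recv(256).decode("ascii").strip()

if __name__ == "__main__":
    print("Camera reports:", get_inspection_result())
```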

Better vision sensors are partially responsible for vision camera improvements. “The tools have gotten a lot better. Sensors costing less than $1,500 can do what high-end sensors did a few years ago,” said SICK’s Gunnarsson. “You can (now) do pattern matching in any field of view with cameras costing as little as $1,000.”

CCD sensors have been the predominant technology in machine vision cameras because of their resolution. But companies are taking a closer look at lower-cost CMOS sensors, according to Applied Vision’s Novini, thanks to improvements in power consumption and dynamic range.

Machine vision lighting systems are also benefiting from improvements in light-emitting diode (LED) technology, enabling vision systems to detect objects or patterns in remote or hidden areas. LEDs are now stable over time, have a long life, can be turned on and off rapidly, and can be controlled accurately.

“LEDs have become the standard for lighting,” said Siemens’ Agapakis. “Brighter LEDs can illuminate a larger area from a long distance.”

Further integration of vision system components and subsystems will likely be dictated by the laws of physics, Agapakis added. “There’s a push to make vision hardware smaller, developing cameras with lenses and the processor built in. The processor makers have been able to drive down power consumption. The main issue is dissipating the heat generated.”

Getting the signals from the machine vision system to a robot or other controller often means connecting bulky cables. “One of the most critical aspects of success of integrating machine vision with motion control lies in cables,” said Leutron’s Francois. “A lot of information has to go through these wires and must do so without being stressed mechanically in an environment full of electromagnetic interference.”

A plethora of standards exist, including FireWire, Camera Link, and Gigabit Ethernet. FireWire, originally designed for desktop video applications, cannot keep up with the rigors of machine vision applications where multiple cameras must be linked, according to DALSA’s Dawson. Although Camera Link remains a viable standard, many machine vision companies are adopting the GigE Vision standard, based on the Gigabit Ethernet protocol. It uses standard Gigabit Ethernet hardware and low-cost CAT5e or CAT6 cables.
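
A quick back-of-the-envelope check shows why a single Gigabit Ethernet link over CAT5e or CAT6 cable is usually sufficient; the camera resolution, frame rate, and overhead figure below are illustrative assumptions rather than numbers from the article.

```python
# Does an assumed camera fit on one Gigabit Ethernet link?
width, height = 1280, 1024      # pixels (assumed resolution)
bytes_per_pixel = 1             # 8-bit monochrome
frames_per_second = 60          # assumed frame rate

required_mb_per_s = width * height * bytes_per_pixel * frames_per_second / 1e6
usable_mb_per_s = 125 * 0.9     # ~1 Gb/s line rate, roughly 90% usable after protocol overhead

print(f"camera needs ~{required_mb_per_s:.0f} MB/s of a ~{usable_mb_per_s:.0f} MB/s GigE link")
```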

“FLIR supports the GigE Vision standard and networked infrared vision systems,” added Jason Styron, business development manager of automation for vision camera supplier FLIR Systems (North Billerica, MA). “These features ensure investment protection for organizations committed to advancing manufacturing productivity.”

Hardware and software advances have improved the price-performance ratio of machine vision, making it more affordable to motion control and other systems integrators.

“The price of machine vision systems has come down significantly, to the point where it is no longer a large cost burden to add vision to an industrial automated solution,” said PPT’s Joe Christenson. “Considering the cost of robots and other motion control equipment, vision components sometimes comprise only 10 percent of the total solution cost.”