
Camera Trends 2012: Speed, Resolution, and Software

To find important components of an image, like an edge, Schwarzbach relies on the analysis of grayscale values. In most of today’s cameras, there are 256 shades of gray between black and white. Each shade of gray represents a specific intensity of light at a given pixel, and the differences between neighboring values can help determine the location of an edge, a defect, or an artifact. One way to improve the technology, according to Schwarzbach, is to increase the grayscale depth from 8-bit to 10-bit, going from 256 shades of gray to 1,024 and increasing the intensity resolution of the image rather than just its spatial resolution.
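
As a rough illustration of the idea (not code from the article), the short Python sketch below shows how a jump between neighboring grayscale values can mark an edge, and how bit depth sets the number of available shades; the values and threshold are made up.

```python
# Minimal sketch: locating an edge in a 1-D row of grayscale values by
# looking at the difference between neighboring pixels. An 8-bit sensor
# quantizes intensity to 256 levels; 10 bits gives 1,024.

def gray_levels(bit_depth):
    """Number of distinct gray shades for a given bit depth."""
    return 2 ** bit_depth

def find_edge(row, threshold):
    """Return the index of the first pixel where the intensity jump
    between neighbors exceeds the threshold, or None if no edge."""
    for i in range(1, len(row)):
        if abs(row[i] - row[i - 1]) > threshold:
            return i
    return None

if __name__ == "__main__":
    print(gray_levels(8), gray_levels(10))   # 256 1024
    # Dark background (about 20) with a bright part (about 200) starting at pixel 5
    row = [20, 21, 19, 22, 20, 200, 201, 199, 200]
    print(find_edge(row, threshold=100))     # 5
```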

With better resolution, he says, a camera can capture more information and do a more detailed inspection. “If I was looking at something that was supposed to be plus-or-minus one thousandth of an inch, and I had 1,000 pixels, that’s plus-or-minus one pixel. You really can’t depend on that,” said Schwarzbach. “But if I had 3,000 pixels, now I have 3 pixels to play with.”
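
The arithmetic behind that example can be sketched as follows, assuming for illustration a field of view of roughly one inch across the sensor; the function name and the one-inch figure are illustrative, not from the article.

```python
# Back-of-the-envelope sketch of Schwarzbach's point: how many pixels
# span a given tolerance band for a given field of view and pixel count.

def pixels_per_tolerance(field_of_view_in, pixel_count, tolerance_in):
    """Number of pixels that cover the tolerance band."""
    inches_per_pixel = field_of_view_in / pixel_count
    return tolerance_in / inches_per_pixel

# Assumed 1-inch field of view, 0.001-inch tolerance
print(round(pixels_per_tolerance(1.0, 1000, 0.001), 2))  # 1.0 pixel  -- too tight to trust
print(round(pixels_per_tolerance(1.0, 3000, 0.001), 2))  # 3.0 pixels -- room to play with
```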

Software

Imaging software is another evolving component of camera technology; it has shifted from code dedicated to a particular function to more customizable options. Instead of selling piecemeal library modules that customers then hook together, companies now provide pliable system software for customized, business-specific applications.

“Customers will pay a system house, asking for a specific imaging system that performs functions X, Y, and Z, and the system house will take the software package, set it all up, and switch it on,” said Morse.

As the software part of the market has developed, there are now companies that specialize in writing software for machine vision applications. “It’s not a huge sector of the industry, but it’s growing, and it’s becoming more and more important, because the processing of the data is very often as important as creating the image in the first place,” he said.

Software also offers integrators like Schwarzbach improved control over imaging settings. Using PPT's Control Panel Manager, for example, buttons on a control panel or HMI (the same human-machine interfaces that display total passes and imminent fails) can be linked to any control inside a vision system. With software, a user can enable different intrinsic features of a camera, including gain, shutter speed, and lighting, from within the program. Specific settings can be used for different jobs.

Schwarzbach can click a button, for example, and find the brightness level that is most appropriate, and then have the program remember the particular adjustments. “You couldn’t do that five years ago,” he said.
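
The sketch below is a hypothetical illustration of that idea, not PPT's Control Panel Manager API: each job gets a named "recipe" of gain, shutter speed, and lighting settings that the program can store and recall; the field names and units are assumptions.

```python
# Hypothetical per-job camera "recipes": save the settings that produced
# a good image, then recall them the next time the job runs.

from dataclasses import dataclass

@dataclass
class CameraRecipe:
    gain: float            # sensor gain, dB (assumed unit)
    shutter_us: int        # exposure time, microseconds
    light_level: int       # lighting controller output, percent

recipes = {}

def save_recipe(job_name, gain, shutter_us, light_level):
    """Remember the settings that worked for this job."""
    recipes[job_name] = CameraRecipe(gain, shutter_us, light_level)

def load_recipe(job_name):
    """Return the saved settings so the camera can be reconfigured."""
    return recipes[job_name]

save_recipe("bolt_inspection", gain=6.0, shutter_us=500, light_level=80)
print(load_recipe("bolt_inspection"))
```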

The use of laser light and 3D is also a major development — not a giant sector of the market, according to Morse, but an important technical one. Software has developed in a way that makes 3D image processing possible.

An image system developed by the Burlington, MA-based Visidyne, Inc., for example, measures recurring modulated light at three phases in time to get three images. By analyzing the subtle changes in light reflectance, distance, and brightness in those three images, a 3D shape of an object can be determined and rendered.
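
Visidyne's own processing is not detailed in the article, but a generic three-step phase-shift calculation gives the flavor of how three phase-separated samples of modulated light can yield depth at each pixel; the sample values and the 20-MHz modulation frequency below are assumptions for illustration, not Visidyne's algorithm.

```python
# Generic three-step phase-shift recovery sketch: three intensity samples
# of a modulated signal, offset by 120 degrees, are enough to solve for
# the phase at each pixel; for ranging, the phase maps to distance.

import math

C = 3.0e8  # speed of light, m/s

def phase_from_samples(i1, i2, i3):
    """Recover the modulation phase (radians) from three samples taken
    at phase offsets of -120, 0, and +120 degrees."""
    return math.atan2(math.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

def distance_from_phase(phi, mod_freq_hz):
    """Distance implied by a round-trip phase delay at the given
    modulation frequency (ignores phase-wrapping ambiguity)."""
    return C * phi / (4.0 * math.pi * mod_freq_hz)

# Synthetic pixel: offset 0.5, amplitude 0.4, true phase 1.0 rad
phi_true = 1.0
samples = [0.5 + 0.4 * math.cos(phi_true + k)
           for k in (-2 * math.pi / 3, 0.0, 2 * math.pi / 3)]
phi = phase_from_samples(*samples)
print(round(phi, 3))                             # ~1.0 rad
print(round(distance_from_phase(phi, 20e6), 3))  # meters at an assumed 20 MHz
```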

“What’s really made it all happen,” said Morse, referring to the growth in 3D vision technology, “is the development of software to handle those images and to assimilate the data and create a 3D image.”

The specific software capabilities have improved as well. “Software has gotten better. Pattern-finding tools and edge-finding tools have gotten faster and more accurate,” said Schwarzbach. A pattern tool, for example, can locate the center of a part and check the part’s edges for surface blemishes and pinholes (see Figure 3). A contour tool, similarly, traces the outside form of a product and warns the user if there are malformations, flash, or short shots to the left or right of an edge.
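
As a generic stand-in for such a pattern-finding tool (the specific tools Schwarzbach uses are not shown here), the sketch below uses OpenCV template matching on a synthetic image to locate a taught pattern and report the center of the matched part; the image sizes and intensities are made up.

```python
# Rough pattern-finding illustration: slide a taught template over the
# camera image, find the best match, and report the part center so edge
# checks can be anchored to it.

import numpy as np
import cv2

# Synthetic "camera image": dark background with a bright 40x40 part
image = np.zeros((200, 200), dtype=np.uint8)
image[60:100, 90:130] = 220

# Taught template: the same bright square centered on a dark border
template = np.zeros((60, 60), dtype=np.uint8)
template[10:50, 10:50] = 220

# Sum-of-squared-differences matching; the best match is the minimum
result = cv2.matchTemplate(image, template, cv2.TM_SQDIFF)
_, _, top_left, _ = cv2.minMaxLoc(result)   # (x, y) of best match

center_x = top_left[0] + template.shape[1] // 2
center_y = top_left[1] + template.shape[0] // 2
print("part center:", (center_x, center_y))  # about (110, 80)
```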

Companies, in turn, are providing the whole package: the cameras, the wiring, the input/output, and the mini-computer. The trend has shifted from a dedicated machine that performed the full operation to a PC-based or processor-based system, where users can easily take advantage of the new software.

“Now many are running a machine vision processor on a PC platform, so that you can run both vision software and any statistical software in the same box,” said Schwarzbach.
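
A minimal sketch of that "same box" idea follows, with made-up measurement values and assumed spec limits: the dimensions reported by the vision side feed directly into process statistics computed on the same PC.

```python
# The PC that runs the vision measurements can also run the statistics
# on them. Measurement values and spec limits below are illustrative.

import statistics

# Diameters reported by the vision system, in inches (made up)
measurements = [0.5003, 0.4998, 0.5001, 0.4997, 0.5005, 0.5000, 0.4999]

usl, lsl = 0.5010, 0.4990        # assumed upper/lower spec limits, inches
mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

# Process capability index (Cpk): margin between the process and the spec
cpk = min(usl - mean, mean - lsl) / (3 * sigma)
print(f"mean={mean:.4f} in  sigma={sigma:.5f} in  Cpk={cpk:.2f}")
```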

Dawson sees a steady migration of intelligence into cameras, and predicts their future use in applications beyond industry and manufacturing. “They’re going to be going into non-traditional machine vision kinds of applications, such as collision detection in cars or gaming,” he said, noting the Microsoft Kinect, a motion-sensing input device for the Xbox 360 video game console and Windows PCs. The technology, based around two cameras and a pattern of projected infrared light, enables players to interact without the need to touch a game controller.

“We’ve seen it in the Microsoft Kinect. It’s a smart box with cameras and other sensors in it. You’re moving the intelligence downstream so the Xbox can concentrate on the game play rather than processing pixels,” he said.

Meeting Customer Expectations

Customers have heightened their expectations. Integrators like Schwarzbach run into scenarios where customers want more and more from their systems.

“They don’t expect a vision system to be able to just tell you that there’s a nut on the bolt,” said Schwarzbach. “They want to be able to make sure that there aren’t any cracks, and there aren’t any chips, and to measure something that’s a couple of thousandths of an inch, located anywhere in a 6- to 8-inch field of view.”

As customers become more demanding about what they want and expect, camera manufacturers have been tuning their speed, resolution, and software to accommodate the needs of their traditional manufacturing customers and new non-traditional users alike.

This article was written by Billy Hurley, Associate Editor, NASA Tech Briefs. Contact the author for questions or more information.
