Thermal imagers allow a user to see an object’s heat signature, and heat provides an entirely different set of performance data than the visible spectrum available to the naked eye. A fully radiometric camera calculates a temperature value for every pixel on screen. The technician uses the on-screen thermal colors to look for temperature differences, relative to previous states or to like components, without coming into direct contact with the device under test.
With a capable infrared camera and an understanding of thermodynamics, the user can discover and often diagnose issues that are unseen. Infrared thermal imaging is typically more accessible and more affordable than radiography, acoustic ultrasound, eddy current analysis, vibration analysis, and other advanced inspection technologies. As a result, thermography has become a mainstay of manufacturing, industrial and commercial maintenance, research and development, and materials analysis.
In this article, we will explore the importance of ease of use, enhanced image quality, and an integrated development approach in today’s thermal imaging applications. Complexity, cost, and image quality used to be barriers to use. Now, thanks to new improvements, a completely different and far broader set of users can take advantage of infrared technology.
Ease of Use
The evolution of thermal imaging systems follows that of most electronics: Capabilities, accuracy, sophistication, and ease of use have increased as size and weight have decreased. Early systems used cooled sensors to get the best thermal image, requiring tanks of liquefied gas, such as nitrogen, to be present.
An imager using earlier technology was both more expensive and more difficult to operate in a portable environment, due to size, power consumption, and warmup time. Infrared imagers have gone from laboratory bench-top, to truck-mount, to heavy portable, to lightweight portable. While not as mature as other semiconductor devices, such as the microprocessor, a handheld thermal imager is nonetheless a complex device, using precision electronics to acquire and process the infrared signal.
Multi-spectral image combination technology now overlays an infrared image on a visible-light image in near-perfect alignment: The user can see the device under test through a partially transparent infrared image (see Figure 1). The technology provides visual context for where a temperature abnormality is located in the physical environment. The capability transfers through to the captured images, so that a third party not present at the original point of measurement can feasibly interpret the data. Captured images can be shared wirelessly and through direct download to PC, and then further manipulated in software.
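The overlay described above can be sketched as a simple alpha composite. The actual combination any given camera performs is proprietary; the function name, array shapes, and blending formula below are illustrative assumptions, not a vendor implementation:

```python
import numpy as np

def blend_ir_visible(visible, ir_colorized, alpha=0.5):
    """Alpha-blend a colorized infrared image over an aligned
    visible-light image. Both arrays are HxWx3 float images in [0, 1];
    `alpha` is the opacity of the infrared layer."""
    return (1.0 - alpha) * visible + alpha * ir_colorized

# Toy 2x2 example: a black visible frame under a white IR frame
visible = np.zeros((2, 2, 3))
ir = np.ones((2, 2, 3))
blended = blend_ir_visible(visible, ir, alpha=0.5)
print(blended[0, 0])  # each channel is 0.5
```

In a real imager the two sensors have different optical axes and fields of view, so alignment (registration) must happen before any blending of this kind.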
As today’s electronic devices become increasingly wireless and more interconnected, it is not surprising that infrared and other measurement devices are following suit. Many newer thermal imaging systems now connect wirelessly with other measurement tools, such as current clamp meters, contact temperature meters, and multimeters. The resulting information provides additional context about why a thermal anomaly may be occurring, and corroborating evidence about the root cause, thus producing a more credible diagnosis.
The biggest challenge to infrared image quality has not been pixel resolution, but rather image focus (see Figure 2). If your image is out of focus, important details, of course, will be blurred or might even be unrecognizable. For an infrared camera, however, there is an added level of impact, in that the focus also affects the temperature measurement calculation. The imager does not actually measure temperature; it calculates temperature based upon the amount of infrared energy being focused onto the microbolometer, along with the input of some other variables (emissivity, transmissivity, reflected background temperature, etc.). If the focus is off, less infrared energy is detected and registered by the sensor and subsequently used for radiometric calculations. Therefore, the displayed apparent temperature could appear lower than it would if the image were in proper focus.
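The dependence of the calculated temperature on detected energy can be illustrated with a deliberately simplified radiometric model. Real cameras use band-limited, per-pixel calibration curves rather than the full Stefan-Boltzmann law, and they also account for atmospheric transmissivity; this sketch inverts only the emissivity and reflected-background terms to show the principle:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def object_temperature(radiance, emissivity, t_reflected):
    """Invert a simplified model: detected radiance is the sum of the
    object's emitted energy and the reflected background energy.
    Temperatures are in kelvin, radiance in W/m^2."""
    reflected = (1.0 - emissivity) * SIGMA * t_reflected ** 4
    return ((radiance - reflected) / (emissivity * SIGMA)) ** 0.25

# A blurred image spreads energy off the target pixel, lowering the
# detected radiance and therefore the computed apparent temperature:
t_focused = object_temperature(1000.0, 0.95, 293.15)
t_blurred = object_temperature(900.0, 0.95, 293.15)  # 10% energy loss
print(t_focused > t_blurred)  # True
```

The specific numbers are made up; the point is only that the same object, defocused, reports a lower apparent temperature.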
Traditional manual focus on infrared cameras provides good control, but users are often unable to tell whether they are focusing the infrared image correctly. Old-style auto-focus systems, based on the straight-edge detection algorithms made popular by point-and-shoot digital cameras, often do not work well for focusing on infrared wavelengths. Heat in the physical world does not always travel in the straight lines that the human eye is used to seeing.
The newest and most exciting advancement in thermal imager focusing is the incorporation of a laser range-finding device into the focusing mechanism. The device emits a laser beam and times its reflected return, calculates a precise distance to the target, and then computes the corresponding optical focal position for that point in space, driving the infrared optical and sensor system to focus on that exact spot. The resulting system is both faster and more accurate, much to the surprise and delight of experienced thermographers, long accustomed to endless minute adjustments of the focus wheel (see Figure 3).
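The two steps the text describes, a time-of-flight distance measurement followed by a distance-to-focus calculation, can be sketched as follows. The thin-lens equation stands in for the proprietary distance-to-motor-position mapping a real imager would use, and the 13 mm focal length is a hypothetical value:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Target distance from the laser's round-trip time of flight."""
    return C * round_trip_seconds / 2.0

def image_distance(focal_length_m, object_distance_m):
    """Thin-lens equation (1/f = 1/d_o + 1/d_i), solved for the image
    distance: where the optics must place the focal plane to sharply
    image an object at the measured distance."""
    return 1.0 / (1.0 / focal_length_m - 1.0 / object_distance_m)

d = tof_distance(20e-9)        # a 20 ns round trip
print(round(d, 3))             # 2.998, i.e. about 3 m
focus = image_distance(0.013, d)  # hypothetical 13 mm lens
```

A production system would then translate `focus` into motor steps via a calibrated lookup, rather than relying on the ideal lens model alone.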
A Systems Engineering Approach
Achieving this technological integration has required extensive coordination between disparate engineering specialties. Thermal engineering begins at the product concept and does not end until the product is able to be manufactured at high yields. At product concept, industrial design engineers take into account product goals, human factors, and customer feedback.
The infrared signal path is one of the most complex parts of the design. Unlike a digital camera, it is often impossible to isolate the stray energy inside the camera housing and electronics. Thus, a good design needs to take the stray energy into account and effectively calibrate it out. The engineering team not only designs the hardware, software, and mechanics of the thermal imager itself, but also the manufacturing test stations. The calibration station takes numerous measurements across both the ambient temperature range and the target temperature range, which is then fitted to a calibration profile. The technology not only calibrates each pixel of the infrared sensor, but also simultaneously calibrates the full electronics in final form. The calibration method is created collaboratively between the electro-optical engineers, the manufacturing engineers, and the system architect who is responsible for the radiometry, or the understanding of how the energy transfers through the system.
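The idea of fitting measurements to a calibration profile can be sketched with a per-pixel curve fit. The counts and temperatures below are invented, and a real station sweeps the ambient temperature as well as the target temperature; this shows only the fitting step for a single pixel at one ambient point:

```python
import numpy as np

# Hypothetical blackbody targets (deg C) and the raw counts one pixel
# reported while viewing them during calibration:
target_temps = np.array([20.0, 50.0, 100.0, 150.0, 200.0])
raw_counts = np.array([5100.0, 5900.0, 7400.0, 9000.0, 10700.0])

# Fit a quadratic calibration curve: temperature as a function of counts.
coeffs = np.polyfit(raw_counts, target_temps, deg=2)
calibrate = np.poly1d(coeffs)

# The fitted curve recovers roughly 100 deg C at the 7400-count point:
print(round(float(calibrate(7400.0)), 1))
```

Repeating this fit for every pixel, across the full ambient range, is what makes the station able to calibrate the complete electronics in their final form rather than the bare sensor alone.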
A calibrated sensor and associated electronics are only part of the technology. The laser focus feature discussed earlier, for example, requires integration of the imager’s infrared core, the laser, the electronics, the focus motor, and the software. It is not as simple as the mechanical engineer choosing a focus motor. Instead, the engineer must work closely with the rest of the team to find a motor that has the right level of controls and can be interfaced with the software. The focus motor must be able to make small adjustments (during calibration) as well as large adjustments (during customer-driven focus), and thus must have a precise interface to the software running on the camera. In other words, the software must be able to move the focus quickly for large focus changes, but also be able to move the focus very slightly and accurately for small focus changes. In addition, this solution must be robust enough to support a two-meter drop specification along with other environmental specifications.
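The coarse/fine interface described above can be sketched as a small driver class. The class name, step units, and limits are illustrative assumptions; the point is that one command path must serve both tiny calibration trims and large customer-driven slews:

```python
class FocusMotor:
    """Hypothetical sketch of a focus motor interface. Position is in
    motor steps; a real driver would also expose homing, backlash
    compensation, and fault reporting."""

    def __init__(self, max_step=2000, position=0):
        self.max_step = max_step  # largest single commanded slew
        self.position = position

    def move(self, steps):
        # Clamp each command to the mechanical limit so software can
        # safely request both single-step trims and full-range slews.
        if steps > 0:
            steps = min(steps, self.max_step)
        else:
            steps = max(steps, -self.max_step)
        self.position += steps
        return self.position

motor = FocusMotor()
motor.move(1500)  # large, customer-driven focus change
motor.move(-3)    # tiny calibration trim
print(motor.position)  # 1497
```

The same interface is exercised by the calibration station and by the auto-focus software in the field, which is why its precision requirements are set jointly by the team rather than by the mechanical engineer alone.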
Today’s thermal imagers have a very complex software system containing several modules. The hardware/software interface may be managed by a processor core running an operating system (such as Linux®). The data processing of the infrared energy (analyzing and displaying it) may be done with an FPGA. The software engineers must work closely with hardware engineers to make the right choice of both electronic components and where the different pieces of software must reside. Image processing done on the FPGA, for example, may consume too much power and thus reduce battery life; if it is moved to the processor, however, a more complex interface to the processor may be needed. Another example is the decision-making behind combining the visual and infrared images when displayed on the screen. Does it make sense to combine at the processor level, or wait until the infrared image is processed by the FPGA in the format needed for the display? These tradeoffs are not easy to make, and require the integrated team to work very closely.
When Complexity Is Worth It
By adding precision electronics and laser optics to thermal imagers, engineering teams have managed to greatly expand the technology’s applications. The tool has had to become both more intuitive as well as more precise: simpler and more complicated, at the same time. A thermal image alone did not provide enough information for the technology to reach beyond expert thermographers. Blending the thermal and digital image suddenly gave users the context to apply thermal vision to their environments. Similarly, a truly functional auto-focus was necessary for average users to get images that were clear and accurate enough to be broadly useful. The development cycle is far from complete; more innovations are sure to come.
The article was written by Michael Stuart, Sr. Product Marketing Manager at Fluke Corp., and Jeff Abramson, Director of Thermography Development at Fluke Corp.