Traditional manual focus on infrared cameras provides good control, but users often cannot tell whether the infrared image is correctly focused. Older auto-focus systems, based on the straight-edge detection algorithms made popular by point-and-shoot digital cameras, often work poorly at infrared wavelengths: heat in the physical world does not always follow the straight lines the human eye is used to seeing.
The newest and most exciting advancement in thermal imager focusing is the incorporation of a laser range-finding device into the focusing mechanism. The device emits a laser beam and measures the time it takes for the reflection to return; from that time of flight it calculates the precise distance to the target. The camera then converts that distance into the corresponding optical focal position and drives the infrared optics and sensor system to focus on that exact spot. The resulting system is both faster and more accurate, much to the surprise and delight of experienced thermographers long accustomed to endless minute adjustments of the focus wheel (see Figure 3).
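The distance-to-focus calculation described above can be sketched in a few lines. This is an illustrative assumption, not Fluke's actual implementation: it models the range finder as a time-of-flight measurement and the optics as a simple thin lens, with the 13 mm focal length chosen purely for the example.

```python
# Hypothetical sketch of laser-assisted autofocus: measure distance by
# time of flight, then derive the lens position from the thin-lens
# equation. Function names and the focal length are assumptions.

C = 299_792_458.0  # speed of light, m/s

def distance_from_time_of_flight(round_trip_s: float) -> float:
    """The laser pulse travels to the target and back, so halve the path."""
    return C * round_trip_s / 2.0

def lens_image_distance(subject_m: float, focal_length_m: float) -> float:
    """Thin-lens equation: 1/f = 1/d_o + 1/d_i  ->  d_i = f*d_o / (d_o - f)."""
    return focal_length_m * subject_m / (subject_m - focal_length_m)

# A ~33.4 ns round trip corresponds to a target about 5 m away.
d = distance_from_time_of_flight(33.36e-9)
di = lens_image_distance(d, focal_length_m=0.013)  # assumed 13 mm IR lens
```

In a real imager the mapping from distance to motor position would come from a calibrated lookup table rather than an ideal lens model, but the principle is the same: one distance measurement determines one focal position, with no hunting.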
A Systems Engineering Approach
Achieving this technological integration has required extensive coordination between disparate engineering specialties. Thermal imager engineering begins at product concept and does not end until the product can be manufactured at high yields. At the concept stage, industrial design engineers take into account product goals, human factors, and customer feedback.
The infrared signal path is one of the most complex parts of the design. Unlike in a digital camera, it is often impossible to isolate stray energy inside the camera housing and electronics. Thus, a good design needs to take that stray energy into account and effectively calibrate it out. The engineering team designs not only the hardware, software, and mechanics of the thermal imager itself, but also the manufacturing test stations. The calibration station takes numerous measurements across both the ambient temperature range and the target temperature range; these measurements are then fitted to a calibration profile. The process not only calibrates each pixel of the infrared sensor, but also simultaneously calibrates the full electronics in final form. The calibration method is created collaboratively by the electro-optical engineers, the manufacturing engineers, and the system architect responsible for the radiometry, or the understanding of how energy transfers through the system.
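The per-pixel fitting step can be illustrated with a minimal sketch of two-point non-uniformity correction, a standard technique for thermal sensors. The article does not specify Fluke's actual calibration math, so the linear model, raw counts, and reference temperatures below are all illustrative assumptions.

```python
# Minimal sketch of per-pixel calibration (two-point non-uniformity
# correction). Each pixel's raw reading at two known blackbody targets
# yields a gain and offset; applying them flattens the pixel's response.
# All numeric values here are illustrative, not real calibration data.

def fit_pixel(raw_cold: float, raw_hot: float,
              t_cold: float, t_hot: float) -> tuple[float, float]:
    """Linear fit: temperature = gain * raw_counts + offset."""
    gain = (t_hot - t_cold) / (raw_hot - raw_cold)
    offset = t_cold - gain * raw_cold
    return gain, offset

def correct(raw: float, gain: float, offset: float) -> float:
    """Apply the stored per-pixel coefficients to a raw reading."""
    return gain * raw + offset

# One pixel viewing 20 °C and 100 °C blackbody targets (assumed counts):
gain, offset = fit_pixel(raw_cold=2100, raw_hot=3400, t_cold=20.0, t_hot=100.0)
temp = correct(2750, gain, offset)  # midpoint raw count -> 60.0 °C
```

A production station would repeat this fit for every pixel and at several ambient temperatures, building the multi-dimensional calibration profile the article describes.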
A calibrated sensor and associated electronics are only part of the technology. The laser focus feature discussed earlier, for example, requires integration of the imager's infrared core, the laser, the electronics, the focus motor, and the software. It is not as simple as the mechanical engineer choosing a focus motor. Instead, the engineer must work closely with the rest of the team to find a motor that offers the right level of control and can be interfaced with the software. The focus motor must be able to make small adjustments (during calibration) as well as large adjustments (during customer-driven focus), and thus must have a precise interface to the software running on the camera. In other words, the software must be able to move the focus quickly for large focus changes, but also very slightly and accurately for small focus changes. In addition, this solution must be robust enough to support a two-meter drop specification along with other environmental specifications.
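The dual requirement of fast, coarse moves and slow, precise trims might look something like the interface sketched below. This is a hypothetical software-side abstraction, not Fluke's actual motor driver; the step sizes and the full-step/microstep ratio are assumptions for illustration.

```python
# Hypothetical focus-motor interface: one stepper driver serves both
# large, fast moves (customer-driven refocus) and small, precise moves
# (calibration trims). The 16:1 microstepping ratio is an assumption.

class FocusMotor:
    MICROSTEPS_PER_FULL_STEP = 16

    def __init__(self) -> None:
        self.position = 0  # current position, in microsteps

    def move_coarse(self, full_steps: int) -> None:
        """Fast, large travel toward the laser-computed focal position."""
        self.position += full_steps * self.MICROSTEPS_PER_FULL_STEP

    def move_fine(self, microsteps: int) -> None:
        """Slow, precise trim used during calibration."""
        self.position += microsteps

motor = FocusMotor()
motor.move_coarse(100)  # jump near the target focus position
motor.move_fine(-3)     # trim by a fraction of a full step
```

Exposing both granularities through one interface is what lets the same software path serve factory calibration and field autofocus, as the article's engineers require.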
Today’s thermal imagers have a very complex software system containing several modules. The hardware/software interface may be managed by a general-purpose processor running an operating system such as Linux®. The data processing of the infrared energy (analyzing and displaying it) may be done with an FPGA. The software engineers must work closely with hardware engineers to make the right choices of both electronic components and where the different pieces of software must reside. Image processing done on the FPGA, for example, may consume too much power and thus reduce battery life; if it is moved to the processor, however, a more complex interface to the processor may be needed. Another example is the decision-making behind combining the visual and infrared images when displayed on the screen. Does it make sense to combine them at the processor level, or to wait until the infrared image is processed by the FPGA into the format needed for the display? These tradeoffs are not easy to make, and require the integrated team to work very closely.
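The image-combining step the article mentions is commonly implemented as a per-pixel weighted blend of the visual and infrared frames. The sketch below shows that operation in plain Python for clarity; it is an illustrative assumption about the blend, not a description of where Fluke performs it (FPGA or processor).

```python
# Minimal sketch of blending aligned visual and infrared frames for
# display (often called image fusion). Shown here on small Python lists;
# in a real imager this per-pixel math runs on the FPGA or processor.

def blend(visual: list[list[float]],
          infrared: list[list[float]],
          alpha: float = 0.5) -> list[list[float]]:
    """Per-pixel weighted blend: alpha=0 shows only visual, alpha=1 only IR."""
    return [
        [(1 - alpha) * v + alpha * ir for v, ir in zip(v_row, ir_row)]
        for v_row, ir_row in zip(visual, infrared)
    ]

visual = [[100, 100], [100, 100]]    # toy 2x2 grayscale frame
infrared = [[200, 0], [0, 200]]      # toy 2x2 colorized IR frame
fused = blend(visual, infrared, alpha=0.5)  # [[150.0, 50.0], [50.0, 150.0]]
```

Whether this arithmetic belongs on the FPGA (lower latency, fixed display format) or on the processor (more flexibility, simpler pipeline) is exactly the kind of tradeoff the article describes.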
When Complexity Is Worth It
By adding precision electronics and laser optics to thermal imagers, engineering teams have greatly expanded the technology's applications. The tool has had to become both more intuitive and more precise: simpler and more complicated at the same time. A thermal image alone did not provide enough information for the technology to reach beyond expert thermographers. Blending the thermal and digital images suddenly gave users the context to apply thermal vision to their environments. Similarly, a truly functional auto-focus was necessary for average users to get images clear and accurate enough to be broadly useful. The development cycle is far from complete; more innovations are sure to come.
The article was written by Michael Stuart, Sr. Product Marketing Manager at Fluke Corp., and Jeff Abramson, Director of Thermography Development at Fluke Corp. For more information, visit http://info.hotims.com/45611-155.