Few technologies have impacted the scientific community – and non-scientific community for that matter – as much as digital imaging technology. From exotic, high-speed imaging systems to rugged machine vision systems to a vast array of sophisticated consumer devices, digital cameras are everywhere these days, documenting every aspect of this world we live and work in. Photonics Tech Briefs recently spoke with executives from four well-known imaging companies to get their perspectives on where imaging technology is today, and where it is going in the future.
Our roundtable panel members are Michael Bode, Ph.D., CEO of Ximea Corp.; Robert LaBelle, Ph.D., VP of Marketing for Photometrics and QImaging; Dany Longval, VP of Worldwide Sales for Lumenera; and Andrew Bridges, Director of Sales & Marketing for Photron.
Photonics Tech Briefs: What constitutes current state-of-the-art in high-speed imaging technology?
Dany Longval: There are two main factors driving state-of-the-art in high-speed imaging technology. The first is the frame rate at which cameras are capable of taking images. Over the past few years, high-speed sensors such as the Sony ICX674, combined with careful engineering design, have enabled up to 53 fps at high resolution. However, being able to capture the image is only one part of the challenge, which leads to the second factor – interface data transfer rate. Until recently, data transfer was limited to sub-5 Gb/s speeds unless you were using a proprietary connection. With the introduction of USB 3.0, images can now be transferred to the computer for analysis and storage at up to 5 Gb/s.
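Longval's two factors interact: the interface data rate caps how many frames per second can actually leave the camera. A back-of-the-envelope sketch in Python makes this concrete (the resolution, bit depth, and protocol-efficiency figures below are illustrative assumptions, not vendor specifications):

```python
# Back-of-the-envelope: how the interface data rate caps frame rate.
# Resolution, bit depth, and efficiency below are illustrative assumptions.

def max_fps(link_gbps: float, width: int, height: int,
            bits_per_pixel: int, efficiency: float = 0.8) -> float:
    """Frames per second a link can sustain, assuming ~80% of the raw
    signaling rate is usable payload (an assumption; overhead varies
    by interface and protocol)."""
    usable_bps = link_gbps * 1e9 * efficiency
    bits_per_frame = width * height * bits_per_pixel
    return usable_bps / bits_per_frame

# A hypothetical 2.8 MP, 12-bit stream over USB 3.0's 5 Gb/s link:
print(round(max_fps(5.0, 1940, 1460, 12)))  # ~118 fps, link-limited
```

At these assumed figures the link, not the sensor, becomes the limiting factor – which is why interface speed is the second half of the state-of-the-art equation.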
Michael Bode: The question about state-of-the-art high-speed imaging has multiple answers. For some applications, the focus is on the fastest exposure time; for these applications there are cameras that allow exposure times in the range of hundreds of picoseconds. To allow recording at the highest frame rates, these cameras typically avoid transferring the images to a host computer, and instead record the image sequences directly to the camera's memory, resulting in limited recording time. For cameras that use a host computer and need to transfer the imaging data for storage, the bottleneck is the transfer medium. Currently there are four competing interfaces that allow high-speed imaging: Camera Link, CoaXPress, USB 3.0, and PCIe.
Robert LaBelle: As CMOS noise and sensitivity improve, CMOS is now challenging scientific-grade CCDs, even EMCCDs, in many difficult low-light applications. A good example for scientific imaging is BAE’s sCMOS devices, with over 100 frames per second (fps) at 4 MP and a noise floor of only a few electrons.
Andrew Bridges: Framing speed and camera size are probably the two key areas that are reflected in the latest generation of high speed cameras.
PTB: CCD or CMOS – which is the dominant technology today, and which has the most potential for the future?
Bridges: Definitely CMOS is the technology the high-speed camera manufacturers are concentrating on. In addition to the greater number of CMOS fabrication facilities that exist worldwide, there is the obvious benefit CMOS provides over CCDs: no blooming. When a CCD pixel is overexposed, the electrons can ‘corrupt’ neighboring columns, resulting in blooming or tearing.
Longval: CMOS is the dominant technology by far, but the market data needs to be analyzed carefully. CMOS dominates consumer applications, but when it comes to industrial and scientific imaging, some applications are better suited to CMOS and others to CCD. One area where CCD was dominant until recently is imaging moving objects, because of the intrinsically global shutter of CCD sensors. With the introduction of global shutter capabilities, CMOS is now starting to close that gap. Another area where CMOS is closing the gap is with a technology called scientific CMOS, which combines very low noise performance with high-speed capabilities, making it a good choice for a number of scientific imaging applications. However, the technology is relatively complex, difficult to use, and very expensive.
LaBelle: CCDs are still preferred in long-stare scientific applications like chemiluminescence due to their near-perfect photometry, high image quality, high sensitivity, and low dark current. Over time, we see CMOS moving to the forefront across all scientific imaging applications as its limitations are addressed.
Bode: Overall, there is no question that CMOS has a far larger market share for area imaging sensors than CCD. In 2012, CMOS sensors had a 92% market share, due to the large consumer market and the advantage that CMOS can be processed in the same manner as today’s microelectronics, leading to price advantages. CCD sensors typically produce better images, with lower noise, higher dynamic range, and higher sensitivity. CMOS sensors require more electronics close to the active pixel area, which results in more “dead” area for light acquisition and in turn gives CCDs a sensitivity advantage over comparable CMOS sensors. Newer developments in CMOS technology use back-illuminated sensors that eliminate this sensitivity disadvantage. A new, exciting variant of CMOS is the sCMOS (scientific CMOS) sensor, which combines very low noise with fast readout and high dynamic range – traditionally the strong points of CCD sensors.
PTB: Are there any promising new imaging sensor technologies on the horizon?
LaBelle: Many in fact. The flexibility of CMOS is leading to specialized detectors with multispectral imaging capabilities, sensors capable of 3D imaging, and availability of a wide range of sensor formats that go far beyond what CCD offered. SPAD sensors – arrays of single-photon avalanche diodes – are also very interesting for high-frame-rate scientific imaging due to the promise of true single-photon imaging.
Longval: The rate of innovation in sensor technology is tremendous. Every year we see a number of new sensors coming to market. In recent years we have seen scientific CMOS and global shutter CMOS. There have also been major improvements to CCD technology to improve sensitivity or increase speed.
Bridges: We are always looking at new technologies that can improve our speed, light sensitivity, noise levels, etc. These might include back-illumination systems, ISIS technology, among many more.
Bode: While there are no completely new technologies on the horizon that could replace CCD or CMOS entirely in the near future, there are some new technologies that could significantly increase the range of applications for digital cameras [such as] light-field imaging. Virtually all cameras today produce images with a more or less narrow depth of field. If the captured image was out of focus, or focused at the wrong distance, the image is without value. Light-field imaging can correct this disadvantage. [And] hyperspectral imaging, [which] uses various technologies to create multiple images in narrow spectral bands. The spectral information can then be used to infer information about various aspects, such as the health of vegetation or the composition of materials.
PTB: Rolling shutter vs. global shutter – what are the relative strengths and weaknesses of each? Which applications best suit each type of shutter?
Bridges: To my mind, a high-speed camera has to utilize a global shutter. A rolling shutter is limited to whatever the framing rate is, i.e., a one-thousand-frame-per-second (1K fps) camera can only provide shuttering at 1/1,000th of a second, or 1 ms. Plus, pixels are typically exposed one row at a time, meaning that if something within the image is moving fast, it will spatially shift from one row to the next, producing some pretty funky results.
LaBelle: The knock on rolling shutter is the geometric distortion it introduces from motion. As cameras with rolling shutters increase in sensitivity and frame rate, the level of rolling shutter distortion often drops to insignificant levels. In the case of fluorescence microscopy, the distortion can also be mitigated by “global shuttering” the illumination. There are cases where the impact of motion can’t be mitigated and the downsides of global shutter – lower frame rates and slightly higher noise – are acceptable. We think both modes will be relevant for some time.
Longval: Rolling shutter sensors can be very fast and are very affordable to manufacture thanks to their simpler transistor structure. They also offer relatively good low-noise performance – not as good as CCD, but better than global shutter CMOS. The problem with a rolling shutter is the image artifacts generated by any moving objects within the field of view, or by camera movement. For applications involving fast motion, a global shutter sensor, be it CCD or CMOS, becomes a must-have.
Bode: The global shutter of CCD sensors and the rolling shutter of CMOS sensors used to offer clear distinctions between the two types of sensors, but since many CMOS cameras now offer global shutter modes, the distinction has become less obvious. In global shutter mode, all pixels of a sensor are exposed at the same time and for the same duration, while in rolling shutter mode the lines of a sensor are exposed at different starting times (sort of like a trigger wave rolling across the sensor). The most obvious effect of this is that a fast moving object can shift position between the times this wave moves from one line to the next, which results in a distorted object in the final image. If the sensor is a color sensor, this can also lead to noticeable color fringes around the objects, and for handheld devices it leads to a “jelly effect” where objects wobble as if they were made from jelly. On the other hand, the noise performance of rolling shutter cameras can be better than that of global shutter cameras.
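The row-by-row “trigger wave” Bode describes can be quantified: a moving object drifts sideways by the product of its image-plane speed and the time the readout wave takes to roll across it. A minimal sketch (all numbers here are illustrative assumptions, not figures from any of the panelists’ cameras):

```python
# Rolling-shutter skew: each row starts exposing a little later, so a
# horizontally moving object lands at a different x-position per row.
# All numbers here are illustrative assumptions.

def skew_pixels(speed_px_per_s: float, rows: int, line_time_s: float) -> float:
    """Horizontal drift between the first and last row covering the
    object: speed multiplied by the time the readout 'wave' takes to
    roll across those rows."""
    readout_span_s = rows * line_time_s
    return speed_px_per_s * readout_span_s

# An object spanning 500 rows, moving at 10,000 px/s, with a 10 µs line time:
print(skew_pixels(10_000, 500, 10e-6))  # ≈ 50 px of skew
```

A global shutter makes `line_time_s` effectively zero for exposure purposes, which is why the skew (and the color fringing and “jelly effect” Bode mentions) disappears.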
PTB: Looking into the future, what area do you predict will see the next big breakthrough in high-speed/scientific imaging technology?
Longval: I would say embedded vision is going to be a major growth market in the future. As technology evolves, imaging will find its way all around us: inside toys, automobiles, medical instruments, our houses – everywhere. The processing capacities found in embedded devices will be able to handle the high data rates associated with high-speed image sensors. High-end imaging will get out of the lab and find its way into our daily lives.
LaBelle: Certainly today’s scientific microscopy cameras already range from traditional high-fidelity image capture to collecting data that represents the underlying biochemistry – images that could never be visualized by eye. The use of increasingly sophisticated optical systems combined with computational approaches will take microscopy and scientific imaging into new realms, where rich data is extracted from the light field, going beyond low-light and photometric imaging.
Bode: For high-speed cameras connected to a host computer, the bandwidth of the connection is the biggest bottleneck. To significantly increase the bandwidth, new protocols and connections need to be developed. With the current PCIe interface or Thunderbolt technology, we can reach 20+ Gbps – slightly more with special hardware (CoaXPress). But this is not the end. PCI Express 3.0 offers a throughput of roughly 8 Gbps per data lane, and each PCIe connection can have up to 16 data lanes, which would theoretically allow 128 Gbps. To put that in perspective, this would allow a 1 MPixel sensor to transmit 16,000 frames per second.
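Bode’s arithmetic can be written out step by step. The lane rate and lane count are quoted from his answer; the 8-bit monochrome frame size is our assumption, chosen because it reproduces the figure he cites:

```python
# PCIe 3.0 bandwidth arithmetic, step by step. Lane rate and lane count
# come from the interview; the 8-bit monochrome frame is an assumption.

lane_gbps = 8                         # ~8 Gbps usable per PCIe 3.0 lane
lanes = 16                            # widest standard PCIe connection
link_gbps = lane_gbps * lanes         # 128 Gbps aggregate

pixels = 1_000_000                    # 1 MPixel sensor
bits_per_pixel = 8                    # assuming 8-bit monochrome output
frame_bits = pixels * bits_per_pixel  # 8 Mbit per frame

fps = link_gbps * 1e9 / frame_bits
print(f"{fps:,.0f} fps")              # 16,000 fps, the figure quoted
```

Doubling the bit depth, or adding color, halves or quarters the achievable frame rate at the same link speed – which is why the interface, not the sensor, sets the ceiling.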
Bridges: Concerning the broader high-speed market, I think the desire for ever higher resolutions at speeds around, or even exceeding, one million frames per second – at a price that will not break the bank – is one area that is ripe for change. [Another] is ever-improving light sensitivity, ideally with every manufacturer presenting its real light sensitivity according to one standard, such as ISO 12232 Ssat, as opposed to the ad hoc systems some use now.