Picture a camera designed to capture images with single-photon resolution at 24,000 frames per second. Thanks to an electronic shutter that can stay open for as little as 3.8 ns and that can be synchronized with fast laser pulses only a few picoseconds long, one can literally see light propagating through space. This capability opens up new applications such as quantum vision, ghost imaging, sub-shot-noise imaging, quantum LiDAR, and quantum distillation, to name a few.
Common to these applications is the need for single-photon detection and high timing resolution, combined with low noise and high sensitivity. This new camera delivers all of that and, in addition, does so on a million pixels simultaneously, thus enabling a considerable speedup in capture and, possibly, in reconstruction. At the core of each pixel is a single-photon avalanche diode (SPAD) that performs photon detection, generating a digital pulse. This pulse can be counted or timestamped, giving the camera its photon-counting and time-resolved character.
In a paper published in the peer-reviewed journal Optica, we presented the first 1-Mpixel camera based on SPAD pixels. The pixels have a pitch of 9.4 μm, with a 7T (7 transistors per pixel) or 5.75T architecture. A micrograph of the camera chip is shown in Figure 1(b). The camera block diagram is shown in Figure 1(a); it comprises a dual binary tree for controlling the shutter with a precision of about 100 ps and a position tunable in steps of 36 ps. This enables one to reconstruct 3D images by measuring the time-of-flight of a light pulse as it leaves the laser and is reflected by an object. It also enables one to capture light-in-flight, thus exposing interesting relativistic effects in the laboratory.
The chip was tested as an intensity image sensor with a standard chart (Figure 2(a)) at up to 14 bits at the pixel level. Figures 3(a) and (b) show 2D and color-coded 3D pictures obtained by illuminating a scene with a 637 nm laser pulsed at 40 MHz and captured at half resolution on the image sensor. The gate window, 3.8 ns long, is shifted from 0.6 ns to 13.2 ns in steps of 36 ps to acquire full photon-intensity profiles as a function of the gate position.
The distance LSB in this measurement corresponds to 5.4 mm. The depth information is reconstructed by detecting the rising-edge position of the smoothed intensity profile for each pixel, corresponding to the time of arrival of the reflected laser pulse. The gate timing skew over the array is compensated by subtracting an independently measured timing-skew distribution from the measured time-of-arrival distribution. In Figure 3(b), red denotes closer proximity to the SPAD camera, whereas blue corresponds to greater distance. The maximum depth range for this measurement was set to 2 m, but it can be extended to tens of meters by lowering the laser repetition frequency and increasing the gate step.
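The per-pixel reconstruction described above can be sketched in a few lines. The following is a minimal illustration, not the authors' actual pipeline: it assumes a simple moving-average smoother and a half-maximum rising-edge detector, and the function name and skew-calibration parameter are hypothetical. The distance LSB follows directly from the 36 ps gate step, since light travels the pixel-to-object path twice: c · 36 ps / 2 = 5.4 mm.

```python
import numpy as np

C = 3.0e8            # speed of light (m/s)
GATE_STEP = 36e-12   # gate scan step (s)
LSB = C * GATE_STEP / 2  # 5.4 mm of distance per gate step

def depth_from_profile(profile, skew_steps=0.0):
    """Estimate object distance from one pixel's gate-scan intensity profile.

    profile: photon counts vs. gate-position index.
    skew_steps: this pixel's gate timing skew in gate steps,
                measured in a separate calibration (hypothetical parameter).
    """
    # Smooth the profile to suppress shot noise (5-tap moving average).
    kernel = np.ones(5) / 5
    smooth = np.convolve(profile, kernel, mode="same")
    # Locate the rising edge as the first crossing of half the peak level.
    half = 0.5 * smooth.max()
    idx = int(np.argmax(smooth >= half))
    if idx > 0:
        # Linear interpolation between the two samples bracketing the crossing.
        frac = (half - smooth[idx - 1]) / (smooth[idx] - smooth[idx - 1])
        edge = idx - 1 + frac
    else:
        edge = float(idx)
    # Convert gate-step position (skew-compensated) to distance in metres.
    return (edge - skew_steps) * LSB

# Synthetic example: a clean step edge at gate position 100 -> roughly 0.54 m.
counts = np.where(np.arange(350) >= 100, 200.0, 0.0)
print(depth_from_profile(counts))
```

Running this per pixel over the full gate scan yields the depth map of Figure 3(b); the smoothing window and edge criterion are free choices that trade noise robustness against edge-localization bias.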
The fine gate-scanning pitch and long exposure are used to achieve high depth precision; the resulting data-acquisition time for this measurement was a few tens of seconds. This is considerably longer than that of other ranging methods such as indirect time-of-flight, but it can be readily reduced by increasing the gate-scanning pitch, reducing the scanning range, and increasing the laser power to shorten the exposure time. In addition, further improvement is expected from implementing on-chip microlenses to boost the sensitivity.
Figure 3(c) shows the measured distance as a function of the actual object distance. In Figure 3(c), (d), and (e), a flat object covered with white paper (reflectance around 60%) is used to evaluate the measured distance, accuracy, and precision. In Figure 3(c), the measured distance is extracted by averaging the single-pixel distance over 20×20 pixels at the center of the array. Very good agreement with the actual distance is observed within the measured range from 0.2 to 1.6 m. In Figure 3(d), the distance accuracy is calculated as the difference between the averaged measured distance and the actual distance. Over the measured distance range, the accuracy is always better than 1 cm. In Figure 3(e), the distance precision is evaluated as the standard deviation of the single-pixel distance over 20×20 pixels at the center of the array. The precision is better than 7.8 mm (rms) for all measured points up to 1.6 m.
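The accuracy and precision metrics above are straightforward statistics over a central region of the per-pixel depth map. A minimal sketch, assuming a 2D array of measured distances (the function name and ROI handling are hypothetical, not from the paper):

```python
import numpy as np

def roi_accuracy_precision(distance_map, actual_distance, roi=20):
    """Accuracy and precision over a central roi x roi pixel window.

    distance_map: 2D array of per-pixel measured distances (m).
    Accuracy  = mean measured distance minus ground truth (signed error).
    Precision = rms spread of single-pixel distances within the window.
    """
    h, w = distance_map.shape
    r0, c0 = (h - roi) // 2, (w - roi) // 2
    patch = distance_map[r0:r0 + roi, c0:c0 + roi]
    accuracy = patch.mean() - actual_distance
    precision = patch.std()
    return accuracy, precision

# Synthetic 1-Mpixel-like map: true distance 1.0 m with 5 mm rms pixel noise.
rng = np.random.default_rng(0)
dmap = 1.0 + rng.normal(0.0, 0.005, size=(1024, 1024))
acc, prec = roi_accuracy_precision(dmap, 1.0)
```

Averaging over 400 pixels shrinks the error of the mean by a factor of 20 relative to the single-pixel spread, which is why sub-centimeter accuracy coexists with a 7.8 mm single-pixel precision.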
Multi-object detection has previously been demonstrated experimentally by coding either temporal illumination or exposure patterns, which involves a large computational cost to recover 3D images. A time-gated time-of-flight sensor provides an alternative, scalable solution by means of compact pixel circuitry and less complicated computation.
Figure 4(a) shows the experimental setup: a 510 nm laser beam pulsed at 40 MHz is spread by a diffuser and used to illuminate a spherical target. The SPAD camera is synchronized with the laser triggering signal, and a transparent plastic plate is inserted between the camera and the object. The distances from the camera to the plastic plate and to the object are 0.45 m and 0.75 m, respectively. Figure 4(b) shows 2D intensity images under indoor lighting with and without the plastic plate inserted. Since the plate is almost transparent, no significant difference is observed between the 2D images for the two cases.
The measured time-gating profiles for three representative points (A, B, and C) are plotted in Figure 4(c). Without the plate, the time-gating profiles for points A and B show only a single smoothed rectangular waveform with its rising edge around gate position 100 (one position step corresponding to 36 ps). For point C, the photon count stays close to zero over the measured gate-position range, indicating that no reflective object is detected at this pixel.
With the plastic plate, by contrast, the profile at point A shows a two-step rising edge around gate positions 40 and 100. Given that the measured profile of photon counts is a convolution of a single smoothed rectangular function with the reflected photon-intensity distribution, the two-step profile is convincing evidence of double reflection from the plastic plate and the spherical object. Similar behavior is observed at point B, where the slope of the first rising edge around gate position 40 is milder than at point A. The profile at point C shows only a single rising edge around gate position 40, corresponding to the reflection from the plastic plate. The variation of the slope of the rising edge around gate position 40 between different points is induced by non-uniform reflection from the surface of the plastic plate.
The results demonstrate the capability of a time-gated SPAD camera to perform spatially overlapped multi-object detection. Note that the proposed scheme can be applied to the detection of more than two reflection peaks. Finer scanning of the virtual gate window in postprocessing enables systematic detection of multiple peaks. The minimum resolvable distance between two neighboring reflective surfaces is fundamentally limited by the finite rise or fall time of the gate-window profile, corresponding to 5-10 cm in this SPAD sensor.
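Since each reflection adds one rising edge to the gate-scan profile, multiple returns can be separated by differentiating the smoothed profile: every rising edge becomes a peak in the derivative. The sketch below is an illustrative approach, not the authors' algorithm; the function name, threshold, and minimum-separation parameter (set here to roughly the gate rise time in steps) are assumptions.

```python
import numpy as np

def find_rising_edges(profile, min_separation=20, threshold_frac=0.2):
    """Locate multiple reflections in a gate-scan photon-count profile.

    Each reflective surface contributes a rising edge, which appears as a
    peak in the derivative of the smoothed profile. min_separation encodes
    the resolvability limit set by the gate rise time (hypothetical value).
    """
    kernel = np.ones(5) / 5
    smooth = np.convolve(profile, kernel, mode="same")
    deriv = np.diff(smooth)
    thresh = threshold_frac * deriv.max()
    edges = []
    for i in range(1, len(deriv) - 1):
        # Local maximum of the derivative above the detection threshold.
        if deriv[i] >= thresh and deriv[i] >= deriv[i - 1] and deriv[i] > deriv[i + 1]:
            if not edges or i - edges[-1] >= min_separation:
                edges.append(i)
    return edges  # gate-position indices of detected reflections

# Two-step profile like point A: plate edge near position 40 (weak return),
# spherical object edge near position 100 (strong return).
pos = np.arange(300)
counts = 50.0 * (pos >= 40) + 150.0 * (pos >= 100)
print(find_rising_edges(counts))  # edges near gate positions 40 and 100
```

Edges closer together than the smoothed gate rise time merge into a single slope, which is the 5-10 cm resolvability limit quoted above.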
In conclusion, a 1-Mpixel time-gated SPAD image sensor has been reported for the first time. In SPAD research, achieving a megapixel SPAD sensor has been considered one of the most important milestones for over a decade. The sensor is applied to high dynamic range 2D imaging and high spatio-temporal resolution 3D imaging. To the best of our knowledge, spatially overlapped multi-object detection with a single-photon time-gating scheme has been experimentally demonstrated for the first time.
Figure 5 shows a state-of-the-art comparison of SPAD pixel pitch and array size. The array size of this sensor is the largest reported, almost 4 times larger than the previous state of the art, while the pixel pitch is among the smallest. Owing to its high-resolution 2D and 3D imaging capabilities, the proposed sensor will be useful in a wide variety of applications such as security, automotive, robotics, biomedical imaging, and scientific research, including quantum imaging and ultra-high-speed imaging.
This article was written by Edoardo Charbon, Professor, EPFL (Lausanne, Switzerland).