Time of flight is an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor. A short burst of light is fired into the scene, and a camera measures the time it takes to return, which indicates the distance of the object that reflected it. The longer the light burst, the more ambiguous the measurement of how far it has traveled, so burst length is one of the factors that determine system resolution.
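
As a rough illustration of that relationship, the sketch below shows how a round-trip time translates into distance and how the burst length alone limits depth precision. It is a minimal sketch with made-up numbers, not the researchers' implementation.

```python
# Minimal sketch of the time-of-flight relation described above; the numbers
# here are illustrative, not taken from the article.

C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting surface for a measured round-trip time."""
    return C * t_seconds / 2.0

def depth_ambiguity(burst_length_seconds: float) -> float:
    """Rough depth uncertainty contributed by the burst length itself."""
    return C * burst_length_seconds / 2.0

print(distance_from_round_trip(66.7e-9))  # a pulse returning after ~66.7 ns -> ~10 m target
print(depth_ambiguity(1e-9))              # a 1 ns burst smears depth by roughly 15 cm
```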

A comparison of the cascaded GHz approach with Kinect-style approaches, visualized as depth images of a key. From left to right: the original image, a Kinect-style approach, a GHz approach, and a stronger GHz approach.

The other factor is detection rate. Modulators that turn a light beam off and on can switch a billion times a second, but today's detectors can make only about 100 million measurements a second. Detection rate is what limits existing time-of-flight systems to centimeter-scale resolution.
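
To get a feel for why measurement rate matters, here is a back-of-the-envelope sketch: the depth spanned by a single timing bin shrinks as the rate rises. Real cameras average many samples and estimate phase, which is how they reach centimeter-scale resolution, so the raw bin sizes below are only indicative, and the formula is an assumption for illustration rather than the authors' model.

```python
# Back-of-the-envelope sketch (an illustrative assumption, not the authors'
# model): one timing bin at a given measurement rate spans a depth of
# c / (2 * rate); averaging and phase estimation let real systems do better.

C = 299_792_458.0  # speed of light, m/s

def raw_depth_bin(rate_hz: float) -> float:
    """Depth spanned by a single timing bin at the given measurement rate."""
    return C / (2.0 * rate_hz)

print(raw_depth_bin(100e6))  # ~1.5 m per bin at a 100 MHz detector
print(raw_depth_bin(1e9))    # ~0.15 m per bin at a 1 GHz modulation rate
```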

There is, however, another imaging technique that enables higher resolution: interferometry. In this technique, a light beam is split in two, and half of it is kept circulating locally while the other half — the “sample beam” — is fired into a visual scene. The reflected sample beam is recombined with the locally circulated light, and the difference in phase between the two beams — the relative alignment of the troughs and crests of their electromagnetic waves — yields a very precise measure of the distance the sample beam has traveled. But interferometry requires careful synchronization of the two light beams.
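
The phase-to-distance step at the heart of interferometry is simple to state. The sketch below shows how a measured phase offset maps to a path-length difference, modulo one wavelength; the wavelength and function names are illustrative assumptions, not the actual system.

```python
# Hedged sketch of the interferometric readout: the phase offset between the
# reflected sample beam and the reference beam encodes a sub-wavelength
# path-length difference. Wavelength and names are illustrative assumptions.
import math

def path_difference_from_phase(phase_rad: float, wavelength_m: float) -> float:
    """Path-length difference implied by a phase offset (ambiguous modulo one wavelength)."""
    return (phase_rad / (2.0 * math.pi)) * wavelength_m

# Example: a quarter-cycle phase shift at a 1550 nm telecom wavelength
print(path_difference_from_phase(math.pi / 2, 1550e-9))  # ~3.9e-7 m, i.e. roughly 390 nm
```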

A new approach to time-of-flight imaging was developed that increases depth resolution 1,000-fold, the kind of resolution that could make self-driving cars practical. The approach could also enable accurate distance measurements through fog, which has proven to be a major obstacle to their development.

Gigahertz optical systems are naturally better at compensating for fog than lower-frequency systems. Fog is problematic for time-of-flight systems because it scatters light — it deflects the returning light signals so that they arrive late and at odd angles. Trying to isolate a true signal in all that noise is too computationally challenging to do on the fly.

With low-frequency systems, scattering causes a slight shift in phase that simply muddies the signal that reaches the detector. But with high-frequency systems, the phase shift is much larger relative to the frequency of the signal. Scattered light signals arriving over different paths will actually cancel each other out; the troughs of one wave will align with the crests of another. Theoretical analyses suggest that this cancellation will be widespread enough to make identifying a true signal much easier.
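
That cancellation argument can be checked with a small numerical sketch. The model below is only illustrative: the path count, delay spread, and frequencies are assumptions rather than the published analysis, but it shows scattered returns adding up almost coherently at a low modulation frequency and largely cancelling at gigahertz rates.

```python
# Hedged numerical sketch of the cancellation argument. The fog return is modeled
# as many scattered paths, each with a random extra delay; all parameter values
# are illustrative assumptions, not the published experiment.
import cmath
import random

random.seed(0)

def scattered_residual(mod_freq_hz: float, extra_delays_s: list) -> float:
    """Normalized magnitude of the summed scattered contributions (unit amplitude each)."""
    total = sum(cmath.exp(-2j * cmath.pi * mod_freq_hz * d) for d in extra_delays_s)
    return abs(total) / len(extra_delays_s)

# 1,000 scattered paths with random extra delays of up to ~10 ns
delays = [random.uniform(0.0, 10e-9) for _ in range(1000)]

print(scattered_residual(10e6, delays))  # low frequency: close to 1, scatter adds up and biases the signal
print(scattered_residual(1e9, delays))   # gigahertz: close to 0, scattered paths largely cancel
```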

In tests, a light signal was sent through 500 meters of optical fiber with regularly spaced filters along its length to simulate the power falloff incurred over longer distances, before being fed to the system. Those tests suggest that at a range of 500 meters, the system should still achieve a depth resolution of only a centimeter.

The system also uses some ideas from acoustics. Anyone who's performed in a musical ensemble is familiar with the phenomenon of “beating.” If two singers are slightly out of tune — one producing a pitch at 440 hertz and the other at 437 hertz — the interplay of their voices will produce another tone, whose frequency is the difference between those of the notes they're singing; in this case, 3 hertz.
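
The arithmetic is easy to verify. The snippet below just restates the example from the text (440 Hz and 437 Hz) together with the trigonometric identity behind beating.

```python
# Tiny numeric check of the beating example in the text. The identity
# cos(a) + cos(b) = 2 * cos((a - b) / 2) * cos((a + b) / 2) means the summed
# waveform pulses in loudness at |f1 - f2| beats per second.
import math

f1, f2 = 440.0, 437.0
print(abs(f1 - f2))  # 3.0 Hz beat

def two_singers(t: float) -> float:
    return math.cos(2 * math.pi * f1 * t) + math.cos(2 * math.pi * f2 * t)

print(two_singers(0.0))                  # 2.0: the tones start in phase (loud)
print(round(two_singers(1.0 / 6.0), 6))  # ~0 at 1/6 s: the envelope passes through a null (quiet)
```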

The same is true with light pulses. If a time-of-flight imaging system is firing light into a scene at a billion pulses a second, and the returning light is combined with light pulsing 999,999,999 times a second, the result will be a light signal pulsing once a second — a rate easily detectable with a commodity video camera. And that slow “beat” will contain all the phase information necessary to gauge distance.
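
A hedged sketch of that down-conversion idea follows. The mixing model and the frame rate are illustrative assumptions (the slow detector is assumed to average away the roughly 2 GHz sum-frequency term on its own), but it shows the key point: the 1 Hz beat can be sampled at camera speeds and still carries the phase that was imprinted at 1 GHz.

```python
# Hedged sketch of reading a GHz phase out of a 1 Hz beat. Mixing a returning
# signal at f1 = 1 GHz (whose phase encodes distance) with a local reference at
# f2 = 999,999,999 Hz yields a 1 Hz beat; a slow detector averages away the
# ~2 GHz sum-frequency term, so only the beat term is modeled here. All values
# are illustrative assumptions, not the published system parameters.
import math

f1 = 1_000_000_000.0   # returning light, pulses per second
f2 = 999_999_999.0     # locally generated reference
beat = f1 - f2         # 1 Hz, easily sampled by a commodity camera
true_phase = 1.2       # radians, imprinted on the GHz signal by the round-trip distance

def detected_beat(t: float) -> float:
    """Slow component of cos(2*pi*f1*t + phase) * cos(2*pi*f2*t), as a slow detector sees it."""
    return 0.5 * math.cos(2 * math.pi * beat * t + true_phase)

# Recover the phase from one second of slow samples (30 "camera frames").
frames = [detected_beat(k / 30.0) for k in range(30)]
i_corr = sum(v * math.cos(2 * math.pi * beat * k / 30.0) for k, v in enumerate(frames))
q_corr = sum(v * math.sin(2 * math.pi * beat * k / 30.0) for k, v in enumerate(frames))
print(math.atan2(-q_corr, i_corr))  # ~1.2: the GHz phase, read out at 1 Hz
```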

Rather than try to synchronize two high-frequency light signals, as interferometry systems must, the returning signal was modulated using the same technology that produced it in the first place: the already pulsed light was pulsed again. The result is the same, but the approach is much more practical for automotive systems.

For more information, contact Abby Abazorius at 617-253-2709.