Whether it's mounted on top of a self-driving car or embedded inside the latest gadget, Light Detection and Ranging (LiDAR) will likely play an important role in helping vehicles see in real time, letting phones map three-dimensional images, and enhancing augmented reality in video games. The challenge is that these 3D imaging systems can be bulky, expensive, and hard to shrink down to the size needed for new applications.
Researchers have created a new silicon chip, with no moving parts or electronics, that improves the resolution and scanning speed needed for a LiDAR system. Big, bulky, heavy LiDAR systems could be replaced with a single flat chip.
Current commercial LiDAR systems use large, rotating mirrors to steer the laser beam and thereby create a 3D image. A newer approach is wavelength steering, in which each wavelength, or "color," of the laser is pointed to a unique angle. The researchers developed a way to accomplish this steering along two dimensions simultaneously, using a "rainbow" pattern to take 3D images. Since the beams are controlled simply by changing the laser's color, multiple phased arrays can be driven simultaneously to create a bigger aperture and a higher-resolution image.
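The idea of wavelength steering can be sketched with a simple linear dispersion model, in which the output angle shifts in proportion to how far the laser is tuned from a center wavelength. The center wavelength and dispersion constant below are illustrative assumptions, not parameters of the researchers' device.

```python
def steering_angle_deg(wavelength_nm: float,
                       center_nm: float = 1550.0,
                       center_angle_deg: float = 0.0,
                       dispersion_deg_per_nm: float = 0.1) -> float:
    """Toy linear dispersion model: beam angle shifts in proportion
    to the laser's detuning from a center wavelength.
    All constants here are hypothetical, chosen only for illustration."""
    return center_angle_deg + dispersion_deg_per_nm * (wavelength_nm - center_nm)


# Sweeping the laser's color sweeps the beam across angles, no mirror needed.
for wl in (1500.0, 1550.0, 1600.0):
    print(f"{wl:.0f} nm -> {steering_angle_deg(wl):+.1f} deg")
```

In this sketch, tuning the laser 50 nm to either side of center swings the beam five degrees in either direction, which is the sense in which "changing colors" replaces a rotating mirror.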
LiDAR is a remote sensing method that uses laser beams (pulses of invisible light) to measure distances. The beams of light bounce off everything in their path and a sensor collects these reflections to create a precise, three-dimensional picture of the surrounding environment in real time.
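The distance measurement described above is a time-of-flight calculation: the one-way distance is the speed of light times the measured round-trip time, divided by two. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_m(round_trip_s: float) -> float:
    """One-way distance to a target, given the pulse's measured
    round-trip time: the light travels out and back, hence the /2."""
    return C * round_trip_s / 2.0


# A pulse that returns after roughly 667 nanoseconds has traveled
# to a target about 100 meters away and back.
print(f"{distance_m(666.7e-9):.1f} m")
```

Repeating this measurement for every pixel in the scan is what yields the per-pixel depth map mentioned below.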
LiDAR can tell how far away each pixel in an image is. It has been used for decades in satellites and airplanes to conduct atmospheric sensing and to measure the depth of bodies of water and the height of terrain.
In order to work broadly in the consumer market, LiDAR must become even cheaper, smaller, and less complex. The simpler and smaller silicon chips can be made, while retaining high resolution and accuracy in their imaging, the more technologies they can be applied to, including self-driving cars and smartphones.
For more information, contact Daniel Strain at