Researchers have developed a compact 3D LiDAR imaging system that can match or exceed the performance and accuracy of the most advanced mechanical systems in use today. 3D LiDAR provides accurate imaging and mapping for many applications: it serves as the “eyes” of autonomous cars and is used in facial recognition software and by autonomous robots and drones. Accurate imaging is essential for machines to map and interact with the physical world, but the size and cost of the technology currently required have limited LiDAR’s use in commercial applications.
The new, integrated system combines silicon photonic components and CMOS electronic circuits on the same microchip. The prototype offers a low-cost solution and could pave the way to large-volume production of compact, high-performance 3D imaging cameras for use in robotics, autonomous navigation systems, mapping of building sites to increase safety, and healthcare.
The silicon photonics system provides much higher accuracy at distance compared to other chip-based LiDAR systems and most mechanical versions. Tests of the prototype show that it has an accuracy of 3.1 millimeters at a distance of 75 meters.
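The article does not detail the ranging method, but chip-scale coherent LiDAR systems of this kind typically use frequency-modulated continuous-wave (FMCW) detection, in which distance is recovered from the beat frequency between the transmitted chirp and its echo, and velocity from the Doppler shift. The sketch below illustrates that standard conversion; the function names and the chirp parameters are illustrative assumptions, not values from the reported system.

```python
# Illustrative FMCW LiDAR conversions (not the authors' implementation).
# In FMCW ranging, a laser chirps linearly over bandwidth B during time T;
# mixing the echo with the outgoing light yields a beat frequency f_beat
# proportional to the round-trip delay.

C = 299_792_458.0  # speed of light, m/s


def fmcw_range(f_beat_hz: float, chirp_bw_hz: float, chirp_period_s: float) -> float:
    """Target range from beat frequency: R = c * f_beat * T / (2 * B)."""
    return C * f_beat_hz * chirp_period_s / (2.0 * chirp_bw_hz)


def doppler_velocity(f_doppler_hz: float, wavelength_m: float) -> float:
    """Radial velocity from Doppler shift: v = lambda * f_d / 2."""
    return wavelength_m * f_doppler_hz / 2.0


# Example with assumed parameters: 1 GHz chirp over 10 microseconds,
# measured beat frequency of 1 MHz.
r = fmcw_range(1e6, 1e9, 10e-6)
print(f"range: {r:.3f} m")  # ~1.499 m for these assumed parameters
```

Because the beat frequency scales linearly with delay, range resolution is set by how finely the beat frequency can be measured relative to the chirp bandwidth, which is why coherent detection can reach millimeter-level accuracy at long range.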
The combination of high performance and low-cost manufacturing could accelerate existing applications in autonomy and augmented reality. It could also open new directions, such as industrial and consumer digital twin applications requiring high depth accuracy, or preventive healthcare through remote behavioral and vital signs monitoring requiring accurate velocity measurement.
A key problem for previous integrated systems has been providing a dense array of pixels that can be easily addressed; this has restricted them to fewer than 20 pixels. The new system is the first large-scale 2D coherent detector array, with 512 pixels. The researchers are working to extend the pixel arrays and the beam steering technology to make the system even better suited to real-world applications and to further improve performance.
For more information, contact Peter Franklin at