Sweeping the Environment

In order to be useful for autonomous driving, lidar has to sweep the environment to produce a usable 3D map. Velodyne has two different technologies for that. One is its Surround View platform, which achieves a 360° view around the sensor by using a solid-state electronic lidar engine that is essentially rotated on a spindle.
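To illustrate how a spinning sweep builds a 3D map, here is a minimal Python sketch that converts each return's azimuth, elevation, and range into a Cartesian point; one full rotation of the spindle yields points all around the sensor. The conversion is the standard spherical-to-Cartesian one, not Velodyne's actual firmware, and the helper name is ours.

```python
import math

def polar_to_cartesian(azimuth_deg, elevation_deg, range_m):
    """Convert one lidar return (spherical coordinates) to a 3D point.

    azimuth_deg:   horizontal sweep angle; a spinning sensor covers 0-360
    elevation_deg: vertical angle of the laser channel
    range_m:       measured distance to the target
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full rotation of the spindle sweeps points all around the sensor.
points = [polar_to_cartesian(az, 0.0, 10.0) for az in range(0, 360)]
```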

The other platform is a small-form-factor embeddable lidar called Velarray™, which uses frictionless beam steering that can be swept in two axes. Since it is not physically rotated, however, its field of view is limited to 120° in the horizontal plane.

For both platforms, the vertical sweep angle is 40°. Looking forward, you want to be able to see a little higher than the horizon even when the road slopes up or down, and you also need to sense objects such as road signs and overpasses. The general consensus in the industry is that a roughly 40-degree vertical field of view is more than adequate for these functions.
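A quick back-of-the-envelope calculation shows why 40° is considered ample. With the field of view centered on the horizon, coverage extends distance × tan(20°) both above and below sensor height, so at 100 m the sensor sees roughly 36 m up and 36 m down. The sketch below assumes this simple centered geometry; the helper name is ours.

```python
import math

def vertical_coverage(distance_m, vfov_deg=40.0):
    """Total height spanned by the vertical field of view at a distance.

    Assumes the FOV is centered on the horizon: coverage extends
    distance * tan(vfov/2) both above and below sensor height.
    """
    half = math.radians(vfov_deg / 2.0)
    return 2.0 * distance_m * math.tan(half)

# ~72.8 m of vertical coverage at 100 m -- plenty for signs and overpasses.
print(f"{vertical_coverage(100.0):.1f} m")
```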

Sensing the Environment

Figure 3. Velodyne's Surround View platform achieves a 360° view around the sensor by using a solid-state electronic lidar engine, which is essentially rotated on a spindle.

The resolution of modern lidar systems is good enough to distinguish between various objects, for example, a pedestrian vs. a bicycle. Besides distance, lidar can also probe the reflectivity of an object. Stop signs are highly reflective: the word "STOP" is in white, whereas the background is red. Thus, the system can not only detect the presence of a sign but also, by virtue of its reflectivity, determine its nature. Likewise, lane markings are more reflective than road surfaces, and so on.
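A minimal sketch of how return intensity could feed a coarse material cue is shown below. The thresholds and the function name are hypothetical placeholders, not Velodyne's method; real systems also calibrate intensity against range and angle of incidence before classifying.

```python
def classify_by_reflectivity(intensity, lane_threshold=0.6, sign_threshold=0.85):
    """Rough material cue from a normalized return intensity (0.0-1.0).

    Thresholds here are illustrative assumptions only.
    """
    if intensity >= sign_threshold:
        return "retroreflective (e.g., sign face)"
    if intensity >= lane_threshold:
        return "lane marking"
    return "road surface / low-reflectivity object"

print(classify_by_reflectivity(0.9))   # retroreflective (e.g., sign face)
print(classify_by_reflectivity(0.7))   # lane marking
```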

This allows lidar to provide a view of the world independent of cameras and radar, although it will probably not replace them. There will always be a need for a redundant sensor modality as a backup in case the lidar system fails, to make sure that the vehicle can safely pilot itself or at least bring itself to a safe stop.
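One way to picture that redundancy is as a simple arbitration rule over sensor health. The states and logic below are purely illustrative assumptions, not any vendor's actual safety case, which would be far richer.

```python
from enum import Enum, auto

class DriveMode(Enum):
    NOMINAL = auto()      # all needed sensor modalities healthy
    DEGRADED = auto()     # lidar lost; continue cautiously on camera/radar
    SAFE_STOP = auto()    # bring the vehicle to a controlled stop

def select_mode(lidar_ok: bool, camera_ok: bool, radar_ok: bool) -> DriveMode:
    """Illustrative arbitration only: redundant modalities back up lidar."""
    if lidar_ok and (camera_ok or radar_ok):
        return DriveMode.NOMINAL
    if camera_ok and radar_ok:
        return DriveMode.DEGRADED
    return DriveMode.SAFE_STOP
```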

Software

I then asked Gopalan about software. He replied that as the speed of autonomous vehicles increases, the processing unit inside these systems, currently a pretty big computer, has precious little time to crunch all the data and make decisions. Because of that, more of the processing will be done in the sensor itself: rather than just providing raw data, future lidars will have embedded analytics that directly provide location information. Gopalan also believes that most problems will continue to be solved with traditional algorithms rather than artificial intelligence. Since algorithms are deterministic, the cause of a problem can be traced more easily.
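As one example of the kind of deterministic, traceable processing Gopalan describes, the sketch below groups raw points into object centroids by simple Euclidean chaining, the sort of in-sensor analytics that could report locations instead of raw data. The function and its max_gap parameter are illustrative assumptions, deliberately simplified; the same input always yields the same clusters, so a failure can be traced.

```python
def cluster_points(points, max_gap=0.5):
    """Group 3D points into objects; points within max_gap meters chain
    into the same cluster. Returns one centroid per detected object."""
    clusters = []
    for p in points:
        for c in clusters:
            # Join the first cluster containing a point close enough to p.
            if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= max_gap ** 2
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    # Centroid of each cluster: per-axis mean of its member points.
    return [tuple(sum(axis) / len(c) for axis in zip(*c)) for c in clusters]

# Two well-separated groups of returns become two object locations.
print(cluster_points([(0, 0, 0), (0.3, 0, 0), (5, 5, 0), (5.2, 5, 0)]))
```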

The Future

Autonomous vehicles and ADAS are poised to become standard technologies in the next few years, increasing the need for improved sensing. Lidar is becoming increasingly sophisticated and will be a major partner in automotive sensing along with cameras and radar.

This article was written by Ed Brown, Associate Editor of Photonics & Imaging Technology.

This article first appeared in the July 2019 issue of Photonics & Imaging Technology Magazine.