Real-time 3D lidar is poised to be the third leg of the trifecta of sensor technologies enabling both advanced driver-assistance systems (ADAS) and autonomous vehicles; the other two are cameras and radar. David Hall, CEO of Velodyne Lidar, Inc., invented the HDL-64 Solid-State Hybrid real-time 3D lidar sensor in 2007, and Velodyne has continued to develop lidar systems for the automotive market ever since. I discussed the background of the technology and its current and future prospects with Velodyne CTO Anand Gopalan.

Figure 1. Retirement communities are a potential first location for truly driverless cars.

Lidar was developed in the 1960s, shortly after the invention of the laser. It measures distance by illuminating a target with a pulsed laser beam and measuring the return time, or phase shift, of the reflected pulse.
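The pulsed time-of-flight principle reduces to one line of arithmetic: the round-trip time of the pulse, multiplied by the speed of light, halved. A minimal sketch (real sensors time the return in dedicated hardware, not in software like this):

```python
# Pulsed time-of-flight ranging: distance from round-trip pulse time.
# Illustrative sketch only; actual lidar times the pulse in hardware.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target given the pulse's round-trip time."""
    # The pulse travels out and back, so halve the total path length.
    return C * round_trip_s / 2.0

# A return arriving ~667 nanoseconds after the pulse left corresponds
# to a target roughly 100 m away.
print(tof_distance(667e-9))  # ~100 m
```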

The development of real-time 3D lidar was a major step forward for the technology. Initially it was just used to create human-readable high-definition maps of the world. At some point, however, Google, among others, realized that once you have this high-definition map, you can use it for a variety of applications, such as autonomous driving. So, mapping and autonomy are very intricately linked. Autonomous vehicles are not just sensing, avoiding, and reacting to the environment; they are actually navigating through the world using a high-definition map.

Where is 3D Automotive Lidar Now — And Where is it Heading?

There are two types of automotive applications for lidar. The first is developing fully autonomous vehicles — SAE Level 4 and 5 systems. We will probably see them first in mobility-on-demand (MOD) fleets operated by ride-share companies like Uber or Lyft; some OEMs have even expressed the desire to run their own autonomous ride-share services.

The other application, which is attracting a lot of interest, is advanced driver-assistance systems (ADAS). Although these systems initially performed simple functions like emergency braking, lane-keep assist, and blind-spot protection, they are moving toward more advanced features, such as highway autopilot and a limited set of autonomous capabilities. According to Gopalan, lidar is necessary to achieve good Level 2+ and Level 3 ADAS systems. Research and development fleets have been on the road since 2017, and Gopalan expects that by 2020 or 2021 we will see production of consumer cars with lidar integrated into ADAS systems.

For the mobility-on-demand, fully autonomous fleet market, Gopalan expects a move from small fleets of a few tens of vehicles to thousands over the next two to three years.

Why Ride-Share Fleets?

I asked Gopalan why the initial market for autonomous vehicles will be for ride-share and mobility-on-demand fleets. He answered that there are a couple of different ways to think about it.

First, there is the economics and the business case. Especially in dense urban areas, ride-sharing is becoming well accepted. The cost of owning an autonomous vehicle, even an economical one, is high. With a ride-sharing service, however, the cost per mile dips below the per-mile cost of private ownership, especially in an urban environment.

For average consumers, it is becoming increasingly attractive to subscribe to an autonomous ride-sharing service that offers access to a wide variety of vehicles, depending on your needs, available at your beck and call. The average person uses a vehicle maybe three or four hours a day, so in this model, the vehicle would be available for other people to use the rest of the time.

From a technology perspective, fleets can be maintained far more regularly — the vehicles can even be serviced on a daily basis if needed. This allows OEMs and ride-share operators to have much better control to make sure that their autonomous vehicles are functioning up to specifications.

I then asked about the safety of deploying an autonomous vehicle in a crowded urban environment. “This is obviously a problem. A lot of the autonomous vehicle technology companies, such as Waymo, Cruise, and Uber, have been working on it for a few years,” said Gopalan. The key metric for determining the success of the technology, at least in the near term, is actually driving speed. Urban areas, even though they are denser and more complicated, tend to be lower-speed environments. At speeds below about 40 miles per hour, those environments tend to be easier for autonomous systems to resolve than a very high-speed environment like a highway. “So, I think that is the reason you're seeing a lot of the players focusing on low-speed urban environments, whether it's Waymo deploying in the streets of Phoenix, or Cruise testing in San Francisco, or Uber testing in Pittsburgh,” he said.

What Lidar Brings to the Sensor Trifecta

Figure 2. A lot of the players are focusing on low-speed urban environments, whether it's Waymo deploying in the streets of Phoenix, or Cruise testing in San Francisco, or Uber testing in Pittsburgh.

I asked Gopalan what 3D lidar can do that cameras and radar can't. “Cameras are relatively inexpensive and have high resolution, but they are a passive sensing technology. Since they are sensitive to ambient light, they don't perform uniformly under changing conditions, especially from dawn to dusk to nighttime. So, since their performance is limited, you get a lot of variability in the results,” he said. Also, with a stereo camera system you get derived, rather than direct, information. In standard stereo, two cameras are spaced a fixed distance apart, and 3D information is derived by triangulating between the two 2D images. The problem is that as distance increases, the angular difference between the two views shrinks, so small errors in the stereo camera system lead to large errors in the measured distance.
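The growth of stereo depth error with distance can be made concrete with the standard triangulation relation Z = f·B/d (depth from focal length, baseline, and disparity). The numbers below — focal length in pixels, baseline, and disparity noise — are illustrative assumptions, not specifications of any particular camera rig:

```python
# Stereo depth from disparity, and how range error grows with distance.
# f (focal length in pixels), B (baseline in metres), and the disparity
# noise are assumed illustrative values.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulated depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float,
                depth_m: float, disp_err_px: float) -> float:
    """First-order depth error: dZ ~= Z^2 / (f * B) * dd.
    The error grows with the *square* of the distance."""
    return depth_m ** 2 / (focal_px * baseline_m) * disp_err_px

f, B, noise = 1000.0, 0.3, 0.25  # pixels, metres, pixels (assumed)
for z in (10.0, 50.0, 100.0):
    print(z, round(depth_error(f, B, z, noise), 2))
# Quadratic growth: ~0.08 m of error at 10 m, ~2.08 m at 50 m,
# ~8.33 m at 100 m — centimetres up close, metres at range.
```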

“Radar, on the other hand, is what I call an active sensor,” he said. It senses the environment by sending and receiving electromagnetic waves. Since it is not sensitive to ambient light, it works well in almost any weather condition. However, radar has limited resolution because of its long wavelength.

“Lidar is a really interesting technology because it sits in between the two other sensor modalities,” said Gopalan. Since it is a light-based technology, it has much higher resolution than radar and can discern surface reflectivity and features such as lane markings. It can sense context much like a camera, but since it brings its own light to the party, it is not at all sensitive to ambient light conditions — it works essentially the same whether it is dawn, dusk, night, or daytime.

Lidar is also a direct, rather than derived, measurement of distance. It sends out a laser pulse and measures precisely when the pulse returns. Since the speed of light is constant, this is a direct and active measurement. The resolution is ultimately limited by the wavelength of the light being used, which for automotive lidar is in the near infrared, with wavelengths of hundreds of nanometers. That means you can resolve distances in the range of a millimeter or two. It is a very precise distance measurement, regardless of ambient light conditions or environmental noise.
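In a pulsed system, the precision of that direct measurement also depends on how finely the return time can be resolved, via delta_d = c · delta_t / 2. A short sketch of the timing budget (illustrative arithmetic, not a description of any specific sensor's electronics):

```python
# Relating range precision to timing precision in pulsed time-of-flight:
# delta_d = c * delta_t / 2. Illustrative arithmetic only.

C = 299_792_458.0  # speed of light, m/s

def range_precision(timing_res_s: float) -> float:
    """Smallest resolvable range step for a given timing resolution."""
    return C * timing_res_s / 2.0

def timing_for_precision(range_res_m: float) -> float:
    """Timing resolution needed to resolve a given range step."""
    return 2.0 * range_res_m / C

# Resolving ~1 mm in range means timing the return pulse to within
# roughly 6.7 picoseconds.
print(timing_for_precision(1e-3))  # ~6.7e-12 s
```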

Sweeping the Environment

In order to be useful for autonomous driving, lidar has to sweep the environment to produce a usable 3D map. Velodyne has two different technologies for that. One is their Surround View platform, which achieves a 360° view around the sensor by using a solid-state electronic lidar engine that is essentially rotated on a spindle.

The other platform is a small form factor embeddable lidar called Velarray™, which uses frictionless beam steering swept in two axes. Since it is not physically rotated, however, its field of view is limited to 120° in the horizontal plane.

For both platforms, the vertical sweep angle is 40°. Looking forward, you want to be able to see a little bit higher than the horizon even if you're on an up or down slope. You also need to sense objects like road signs and overpasses. The general consensus in the industry is that about a 40-degree vertical field of view is more than adequate for these functions.

Sensing the Environment

Figure 3. Velodyne's Surround View platform achieves a 360° view around the sensor by using a solid-state electronic lidar engine, which is essentially rotated on a spindle.

The resolution of modern lidar systems is good enough to distinguish between various objects — for example, a pedestrian vs. a bicycle. Besides distance, lidar can also probe the reflectivity of an object. Stop signs are highly reflective — the word “stop” is in white, whereas the background is in red. Thus, the system can not only detect the presence of a sign but, by virtue of the reflectivity, determine its nature as well. Also, lane markings are more reflective than road surfaces, and so on.

This allows lidar to provide a view of the world independent of cameras and radar, although there will probably not be any attempt to replace them. There will always be a need for a redundant sensor modality as a backup in case the lidar system fails, to make sure that the vehicle is able to safely pilot itself or at least bring itself to a safe stop.

Software

I then asked Gopalan about software. He replied that as the speed of autonomous vehicles increases, the processing unit inside these systems, which at the moment is a pretty big computer, has precious little time to crunch all the data and make decisions. Because of that, a greater share of the processing will be done in the sensor itself. Rather than just providing raw data, in the future, analytics embedded in the lidar will directly provide location information. Gopalan also believes that most problems will continue to be solved with traditional algorithms rather than artificial intelligence: since algorithms are deterministic, the cause of a problem can be more easily traced.

The Future

Autonomous vehicles and ADAS are poised to become standard technologies in the next few years, thus increasing the need for improved sensing technologies. Lidar is becoming increasingly sophisticated and will be a major partner in automotive sensing along with cameras and radar.

This article was written by Ed Brown, Associate Editor of Photonics & Imaging Technology.



This article first appeared in the July 2019 issue of Photonics & Imaging Technology Magazine.
