Real-time 3D lidar is poised to be the third leg of the trifecta of sensor technologies enabling both advanced driver-assistance systems (ADAS) and autonomous vehicles. The other two legs are cameras and radar. David Hall, CEO of Velodyne Lidar, Inc., invented the HDL-64 Solid-State Hybrid real-time 3D lidar sensor in 2007, and Velodyne has continued to develop lidar systems for the automotive market ever since. I discussed the technology's background and its current and future prospects with Velodyne CTO Anand Gopalan.

Figure 1. Retirement communities are a potential first location for truly driverless cars.

Lidar was developed in the 1960s, shortly after the invention of the laser. It measures distance by illuminating a target with a pulsed laser beam and measuring the return time or phase shift of the reflected pulse.
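The return-time principle can be sketched in a few lines: the pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. The timing value below is a hypothetical number chosen purely for illustration.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a target given the measured round-trip (out-and-back)
    time of a laser pulse: d = c * t / 2."""
    return C * t_seconds / 2.0

# A return arriving ~667 nanoseconds after the pulse was fired
# corresponds to a target roughly 100 m away.
print(distance_from_round_trip(667e-9))  # ≈ 100 m
```

The phase-shift variant mentioned above works differently (it compares the phase of a modulated beam rather than timing a discrete pulse), but the distance it recovers is governed by the same round-trip geometry.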

The development of real-time 3D lidar was a major step forward for the technology. Initially it was used only to create human-readable, high-definition maps of the world. At some point, however, Google, among others, realized that once you have this high-definition map, you can use it for a variety of applications, such as autonomous driving. So mapping and autonomy are intricately linked. Autonomous vehicles are not just sensing, avoiding, and reacting to the environment; they are actually navigating through the world using a high-definition map.

Where is 3D Automotive Lidar Now — And Where is it Heading?

There are two types of automotive applications for lidar. The first is fully autonomous vehicles — SAE level 4 and 5 systems. We will probably see them first in mobility-on-demand (MOD) fleets operated by ride-share companies like Uber or Lyft. Some OEMs have even expressed the desire to run their own autonomous ride-share services.

The other application, which is attracting a lot of interest, is for advanced driver assist systems (ADAS). Although initially these systems performed simple functions like emergency braking, lane-keep assist, and blind spot protection, they are moving toward more advanced features, such as highway autopilot and a limited set of autonomous features. According to Gopalan, lidar is necessary to achieve good level 2+ and level 3 ADAS systems. Research and development fleets have been on the road since 2017 and Gopalan expects that by 2020 or 2021 we will see production of consumer cars with lidar integrated into ADAS systems.

For the mobility-on-demand, fully autonomous fleet market, Gopalan expects a move from small fleets of a few tens of vehicles to thousands of vehicles in the next two to three years.

Why Ride-Share Fleets?

I asked Gopalan why the initial market for autonomous vehicles will be for ride-share and mobility-on-demand fleets. He answered that there are a couple of different ways to think about it.

First, there is the economics and the business case. Especially in dense urban areas, ride-sharing is becoming well accepted. The cost of owning an autonomous vehicle, even an economical one, is high. With a ride-sharing service, however, the per-mile cost of using an autonomous vehicle dips below the total cost of owning one, especially in an urban environment.

For average consumers, it is becoming increasingly attractive to subscribe to an autonomous ride sharing service where you can have access to a wide variety of vehicles depending upon your needs, and they will be available at your beck and call. The average person uses a vehicle maybe three or four hours of the day, so in this model, for the rest of the time, the vehicle would be available for other people to use.

From a technology perspective, fleets can be maintained far more regularly — the vehicles can even be serviced on a daily basis if needed. This allows OEMs and ride-share operators to have much better control to make sure that their autonomous vehicles are functioning up to specifications.

I then asked about the safety of deploying an autonomous vehicle in a crowded urban environment. “This is obviously a problem. A lot of the autonomous vehicle technology companies, such as Waymo, Cruise, and Uber, have been working on it for a few years,” said Gopalan. The key metric for determining the success of the technology, at least in the short term, is actually driving speed. Urban areas, even though they are denser and more complicated, tend to be lower-speed environments. At speeds below about 40 miles per hour, those environments tend to be easier for autonomous systems to handle than a very high-speed environment like a highway. “So, I think that is the reason you're seeing a lot of the players focusing on low-speed urban environments, whether it's Waymo deploying in the streets of Phoenix, or Cruise testing in San Francisco, or Uber testing in Pittsburgh,” he said.

What Lidar Brings to the Sensor Trifecta

Figure 2. A lot of the players are focusing on low-speed urban environments, whether it's Waymo deploying in the streets of Phoenix, or Cruise testing in San Francisco, or Uber testing in Pittsburgh.

I asked Gopalan what 3D lidar can do that cameras and radar can't. “Cameras are relatively inexpensive and have high resolution, but they are a passive sensing technology. Since they are sensitive to ambient light, they don't perform uniformly as conditions change, especially from dawn to dusk to nighttime. So, since their performance is limited, you get a lot of variability in the results,” he said. Also, with a stereo camera system you get derived, rather than direct, information. In a standard stereo setup, two cameras are spaced a fixed distance apart. Based on their two 2D images and the angles between them, you derive 3D information. The problem is that as the target gets farther away, small errors in the stereo camera system lead to large errors in the measured distance, because the angular differences become so small.
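The way stereo error blows up with distance can be made concrete. In the standard pinhole model, depth is derived as Z = f·B/d (focal length times baseline over disparity), so a fixed disparity error produces a depth error that grows with the square of the distance. The camera parameters below are hypothetical values for illustration, not figures from the interview.

```python
def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth derived from a standard stereo pair: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float,
                depth_m: float, disparity_err_px: float) -> float:
    """First-order depth error for a given disparity error:
    |dZ| ≈ Z^2 / (f * B) * |dd| — note the Z^2 term."""
    return depth_m ** 2 / (f_px * baseline_m) * disparity_err_px

# Hypothetical rig: 1000 px focal length, 30 cm baseline,
# quarter-pixel disparity matching error.
f, B, derr = 1000.0, 0.3, 0.25
for Z in (10.0, 50.0, 100.0):
    print(f"{Z:5.0f} m -> ±{depth_error(f, B, Z, derr):.2f} m")
```

Running this shows the error climbing from centimeters at 10 m to meters at 100 m: the tenfold increase in range produces a hundredfold increase in depth error, which is exactly the long-range weakness Gopalan describes.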

“Radar, on the other hand, is what I call an active sensor,” he said. It senses the environment by sending and receiving electromagnetic waves. Since it is not sensitive to ambient light, it works well across a wide range of weather conditions. However, radar has limited resolution because of its longer wavelengths.

“Lidar is a really interesting technology because it sits in between the two other sensor modalities,” said Gopalan. Since it's a light-based technology, it has much higher resolution than radar and can see colors, reflectivity, and lane markings. It can sense context similar to a camera, but since it brings its own light to the party, it's not at all sensitive to ambient light conditions — it basically works the same whether it's dawn or dusk or night or daytime.

Lidar is also a direct, rather than derived, measurement of distance. It sends out a laser pulse and measures precisely when the pulse returns. Since the speed of light is constant, this is a direct and active measurement. The resolution is limited only by the wavelength of the light being used, which in the case of automotive lidar is the near infrared, with wavelengths in the range of hundreds of nanometers. That means you can easily resolve distances in the range of a millimeter or two. It's a very precise distance measurement, regardless of ambient light conditions or environmental noise.
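One way to appreciate what "precise" means here is to invert the time-of-flight relation: from d = c·t/2, resolving a distance step Δd requires timing the return to within Δt = 2·Δd/c. This is a back-of-the-envelope sketch, not a description of any particular Velodyne receiver design.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def timing_resolution_for(distance_resolution_m: float) -> float:
    """Round-trip timing resolution needed to achieve a given
    distance resolution.  From d = c * t / 2, Δt = 2 * Δd / c."""
    return 2.0 * distance_resolution_m / C

# Resolving ~1 mm in range requires timing the returned pulse
# to within a few picoseconds.
print(timing_resolution_for(1e-3))  # ≈ 6.7e-12 s
```

In other words, millimeter-scale range resolution implies picosecond-scale timing electronics, which gives a sense of why the receive chain is a central engineering challenge in these sensors.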