Automotive manufacturers working on self-driving cars have the choice of three main sensors: cameras, radar, and LiDAR.

When tied to a computing system, each sensor can support the Advanced Driver Assistance Systems (ADAS) that allow a vehicle to operate autonomously in an environment.

But which sensor works best?

In a Tech Briefs live presentation titled LiDAR Technology Advancements and Market Trends, Jordan Greene, co-founder of the Dublin, CA-based automotive sensor company AEye, answered the following questions from an attendee:

What advantages are there still in LiDAR imaging that could counter regular video cameras? What are the pros and cons of using LiDAR vs. radar vs. ADAS?

Read Greene's edited response below.

Jordan Greene: There are a number of different sensors that will always be prevalent in autonomous vehicle business-model challenges. There are the three that you mention — cameras, radar, and LiDAR — and a slew of others. The interesting thing is that it's a spectrum of value that you get. Every sensor has its pros and cons.

Jordan Greene, co-founder of AEye

Cameras

Cameras, for example, have the "pros" of being able to detect RGB information. They also have the benefit of extremely high resolution. But they have the disadvantage that sunlight can blind them. Contrast is also an issue, and depth information is not available. Those are fundamental limitations of camera technology.

Radar

Also commonly associated with cameras is radar. Radar has the benefit of being able to detect very well through bad weather, but it also has limitations: Radar does have range information, but it does not have good resolution at range.

LiDAR

Counter to both of those, LiDAR is the only sensor that gives you resolution at range: the ability to get very fine and very accurate detection of objects in space. The weakness of LiDAR, on the other hand, is that it does not have resolution comparable to a 2D camera, and it does not see through bad weather as well as radar does.

So, the three of them together are uniquely greater than the sum of their parts separately. We don't think one technology will win. It will have to be all three of them, plus others. This is similar to how you typically don't operate with just one sense in your body. You have smell, you have touch, you have hearing, you have eyesight, and you bring them all together in a very "edge processing" kind of manner.
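The complementary-strengths point above can be made concrete with a minimal sketch. All sensor scores and attribute names below are hypothetical values invented for illustration — they are not AEye's figures or method — but they mirror the trade-offs Greene describes: each sensor leads on a different attribute, so a fused suite inherits the best of each.

```python
# Illustrative sketch: complementary sensor strengths.
# Scores are hypothetical (0.0-1.0), chosen only to reflect the trade-offs
# described in the article: cameras lead on color/resolution, radar on
# bad weather, LiDAR on accurate range detection.

SENSOR_STRENGTHS = {
    "camera": {"color": 0.9, "resolution": 0.9, "range_accuracy": 0.2, "bad_weather": 0.3},
    "radar":  {"color": 0.0, "resolution": 0.3, "range_accuracy": 0.7, "bad_weather": 0.9},
    "lidar":  {"color": 0.0, "resolution": 0.6, "range_accuracy": 0.9, "bad_weather": 0.5},
}

def best_sensor(attribute: str) -> str:
    """Return the sensor with the highest (hypothetical) score for an attribute."""
    return max(SENSOR_STRENGTHS, key=lambda s: SENSOR_STRENGTHS[s][attribute])

def fused_score(attribute: str) -> float:
    """A fused suite takes the best score available for each attribute."""
    return max(scores[attribute] for scores in SENSOR_STRENGTHS.values())

if __name__ == "__main__":
    for attr in ("color", "resolution", "range_accuracy", "bad_weather"):
        print(f"{attr}: best sensor = {best_sensor(attr)}, fused score = {fused_score(attr)}")
```

Run on these toy numbers, no single sensor matches the fused suite across all four attributes, which is the "greater than the sum of their parts" argument in miniature.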

What do you think? Share your questions and comments below.