There are five autonomous driving levels, beginning with Level 1 single-function controls like lane assist and cruise control, and ending with Level 5 (no driver!).

Just below the top of the scale is Level 4, where fully autonomous vehicles still allow human operators to intervene when necessary.

With the help of sensors and cameras, many manufacturers are already developing Level 4 vehicles that operate within closed areas and at limited speeds. In 2018, for example, Alphabet's Waymo began a self-driving taxi service in Phoenix, Arizona, called "Waymo One."

But how safe are autonomous cars when their vision systems are obstructed by the elements?

In a live presentation on Tech Briefs titled "The Path to High-Level Autonomy for Commercial Vehicles," a reader had the following question for Dirk Wohltmann, Engineering Director at the Germany-based autonomous-systems manufacturer ZF:

"How do you guarantee Level 4 autonomy when the vision systems are obstructed, like when there’s snow accumulation, ice, or mud?"

Read Wohltmann's edited response below.

Dirk Wohltmann: Clear vision is really essential. There is a lot of interesting work being done on surface protection and specific surface materials that are resistant to mud, dirt, and snow, as well as on cleaning solutions. There's a real need for it.

We always talk about how we can bring in redundancy here. Redundancy is very important, and the real beauty and creativity start where we don't just add another radar, another camera, or another lidar on top. We also look at how we can utilize information already available in the system as a redundant fallback for the sensor. That could be vehicle-to-vehicle communication, which draws on other sensors.
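As a rough illustration of that fallback idea, here is a minimal Python sketch of preferring whichever estimate is still trustworthy when a camera is obstructed, drawing on data already on the vehicle network rather than extra hardware. The sensor names, confidence scores, and V2V report are assumptions made for the example, not ZF's actual design.

```python
# Illustrative only: a toy fallback selector for an obstructed sensor.
# Sensor names, confidence values, and the V2V feed are assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SourceEstimate:
    source: str                        # e.g. "camera", "radar", "v2v"
    obstacle_range_m: Optional[float]  # None if the source has no usable reading
    confidence: float                  # 0.0 (unusable) .. 1.0 (fully trusted)

def select_range_estimate(estimates: List[SourceEstimate],
                          min_confidence: float = 0.5) -> Optional[SourceEstimate]:
    """Prefer the highest-confidence source that is still usable.

    If the camera is obstructed (low confidence), the same query falls back
    to radar or V2V-reported data already available in the system.
    """
    usable = [e for e in estimates
              if e.confidence >= min_confidence and e.obstacle_range_m is not None]
    return max(usable, key=lambda e: e.confidence, default=None)

# Example: camera blinded by snow; radar and a V2V report remain available.
readings = [
    SourceEstimate("camera", None, confidence=0.1),  # lens covered in snow
    SourceEstimate("radar", 42.5, confidence=0.9),
    SourceEstimate("v2v", 41.8, confidence=0.7),     # report relayed by a nearby vehicle
]
best = select_range_estimate(readings)
print(best.source if best else "no usable source")   # -> "radar"
```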

More important is the self-diagnostic: the system needs very good predictive capability, so that we don't react only after the system or a sensor fails. We have to know it up front, so we have the chance to replace or repair a sensor, if something happens, before we even start the journey. Now you see more and more combined suppliers who have not only the sensor but also the control and/or an actuator; the suppliers are integrating everything in one house. That means the sensors can be better adjusted to the needs of the control system.

We want to build in redundancy without just adding devices, more load on the CAN bus, and more load (and cost) to the vehicle. We want to make sensors smart, self-diagnostic, and more robust.
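To make the pre-trip self-diagnostic idea concrete, here is a minimal sketch of a check that flags sensors for cleaning or service before the journey starts, instead of reacting only after a failure on the road. The health metrics and thresholds are assumptions for the example, not a real supplier interface.

```python
# Illustrative only: a toy pre-trip self-diagnostic gate.
# The health metrics and thresholds are assumptions for this sketch.
from dataclasses import dataclass
from typing import List

@dataclass
class SensorHealth:
    name: str
    blockage_ratio: float   # fraction of the field of view judged obstructed
    drift_estimate: float   # predicted calibration drift, arbitrary units

def pre_trip_check(sensors: List[SensorHealth],
                   max_blockage: float = 0.2,
                   max_drift: float = 1.0) -> List[str]:
    """Return the sensors that need cleaning or service before departure."""
    return [s.name for s in sensors
            if s.blockage_ratio > max_blockage or s.drift_estimate > max_drift]

front_sensors = [
    SensorHealth("front_camera", blockage_ratio=0.35, drift_estimate=0.2),  # mud on lens
    SensorHealth("front_radar", blockage_ratio=0.02, drift_estimate=0.1),
]
needs_service = pre_trip_check(front_sensors)
if needs_service:
    print("Do not start the mission; service:", needs_service)
```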

Have a question (or a comment) about autonomous driving levels? Share it below.