A self-driving car contains a multitude of sensors, including detectors for lanes, traffic signs, and free space. Its advanced driver-assistance system, or ADAS, components must be evaluated across a range of conditions before the vehicle can officially get on the road.

One safe way to go for a test drive: recreating the world in simulation. You can go for a ride from your desk, says Warren Ahner, CEO of the San Jose, CA-based simulation software provider RightHook.

"Nobody dies in the metaverse, at least not yet," Ahner told an audience this month in a Tech Briefs-led presentation titled Latest Enhancements for ADAS Sensor Testing and Simulation. "It makes it really easy to explore those edge cases you might fund impossible to locate in the real world."

Companies like RightHook build their virtual environments based on well-understood physical phenomena, with an assist from scanning and mapping companies that make digital replicas of the real world.

Warren Ahner, CEO, RightHook

But some natural elements are difficult to recreate. What if you're driving through rough weather, for example?

During the live presentation, a reader had the following question for Ahner:

"For vehicle simulation software, how do you compensate for the interaction of rain or snow storms on lidar sensors?

Warren Ahner: Precipitation such as rain and snow can definitely reduce the performance of a lidar sensor. Lidar is particularly affected because of the beam divergence and the short pulse duration. Snow detection noise is really concentrated near the sensor and manifests itself in these kinds of large clusters.

What we see is that snow cluster noise introduces false detections and obscures important obstacles, leading to some really critical false-negative situations. You really get into that once you start approaching 3/4" of snow accumulation an hour.

If you're below that [rate], the noise is there, but it doesn't become critical false-negative territory. So first and foremost, this is one of the areas where we do start using empirical data.

We could do an individual photon particle simulation, but you're talking about a huge computational load: a 30-second simulation would probably take a few hours to calculate. Instead, we apply an empirical data model over the top of it. We apply data that we've collected, or that some of our research partners in academia and government have collected, "over top" of our sensor model. You'd specify, say, the rate of snow and a few other characteristics of that weather situation, and we would put that in as a noise model over top of the original data.
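To illustrate the general idea (a minimal sketch, not RightHook's actual implementation), an empirical weather-noise layer over a simulated lidar point cloud might look something like the following. The dropout probability, clutter counts, and near-range cluster scale are hypothetical placeholders standing in for values that would be fitted from field-collected data:

```python
import numpy as np

def apply_snow_noise(points, snowfall_in_per_hr, rng=None):
    """Overlay an empirical snow-noise model on a simulated lidar point cloud.

    points: (N, 3) array of x, y, z returns from the clean sensor model.
    snowfall_in_per_hr: snowfall accumulation rate in inches per hour.
    All numeric constants below are illustrative placeholders for
    empirically fitted parameters.
    """
    rng = rng or np.random.default_rng()

    # 1. Attenuation: randomly drop returns, more aggressively at higher rates.
    drop_prob = min(0.5, 0.3 * snowfall_in_per_hr)      # placeholder fit
    kept = points[rng.random(len(points)) > drop_prob]

    # 2. Clutter: inject false detections clustered near the sensor,
    #    mimicking the large near-range snow clusters seen in real data.
    n_clutter = int(200 * snowfall_in_per_hr)            # placeholder fit
    ranges = rng.exponential(scale=2.0, size=n_clutter)  # meters from sensor
    azim = rng.uniform(0.0, 2.0 * np.pi, n_clutter)
    elev = rng.uniform(-0.1, 0.1, n_clutter)
    clutter = np.column_stack((ranges * np.cos(azim),
                               ranges * np.sin(azim),
                               ranges * np.sin(elev)))

    return np.vstack((kept, clutter))

# Example: a dummy clean cloud, then noise for a storm approaching the
# critical 3/4"-per-hour accumulation rate mentioned above.
clean_cloud = np.random.default_rng(0).uniform(-50.0, 50.0, size=(10000, 3))
noisy_cloud = apply_snow_noise(clean_cloud, snowfall_in_per_hr=0.75)
```

The point of this structure is that the expensive physics stays in the base sensor model, while the weather effect is a cheap statistical overlay parameterized by the snowfall rate and a few other characteristics of the storm.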

That said, yesterday Cruise had its very first autonomous taxi ride with no safety operator. They're operating in San Francisco, which never sees snow, and they don't run their vehicles in the rain. We're still working through this with our customers and partners to build a better and better solution. Right now, it's not really the problem we have to focus on solving too heavily, because no one is there yet.