Two decades have passed since automotive manufacturers began using the first microelectromechanical systems (MEMS) accelerometer to measure strong acceleration and trigger the deployment of airbags (see Figure 1). The inaugural inertial sensor paved the way for more widespread use of accelerometers in today’s advanced driver assistance systems (ADAS).

Figure 1. The first accelerometer for automotive airbags, pioneered at Analog Devices. (Image Credit: Analog Devices)

Present ADAS technologies also incorporate other types of MEMS inertial sensors, including gyroscopes, pressure sensors, and magnetometers. In fact, the much-beloved SUV would not exist if not for the rollover safety features made possible by MEMS. A MEMS gyroscope detects rotation around the vehicle’s longitudinal (roll) axis — the primary input for the crash detection algorithm.

While inertial sensors play a prominent role in automated driving, they also enable equally important ADAS applications that are either here today or will arrive soon. What do engineers need to know about MEMS inertial sensors when designing ADAS, and what do these technologies mean for automotive manufacturers and consumers over the next 10-20 years? Let’s review the role of sensors in both present and future automotive technologies.

Rollover Sensing

Rollover sensing, a passive vehicle safety function, detects whether a car is rolling over and triggers the deployment of airbags. Inertial sensors provide the primary feedback (roll rate, lateral acceleration, and vertical acceleration) for crash detection algorithms.
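To make the rate/angle trade-off concrete, here is a minimal Python sketch of a threshold-based rollover criterion. The function names and the two threshold constants are purely illustrative assumptions; production crash detection algorithms are far more sophisticated and fuse lateral and vertical acceleration as well.

```python
def rollover_detected(roll_rate_dps: float, roll_angle_deg: float) -> bool:
    """Illustrative rollover criterion: a fast roll deploys at a smaller angle."""
    CRITICAL_ANGLE_DEG = 50.0   # quasi-static tip-over angle (made up for illustration)
    CRITICAL_RATE_DPS = 250.0   # roll rate that would trigger on its own (illustrative)
    # Linear trade-off between roll angle and roll rate.
    margin = abs(roll_angle_deg) / CRITICAL_ANGLE_DEG + abs(roll_rate_dps) / CRITICAL_RATE_DPS
    return margin >= 1.0

def update_roll_angle(roll_angle_deg: float, roll_rate_dps: float, dt_s: float) -> float:
    # Integrate the gyroscope's roll-rate signal to track the roll angle.
    return roll_angle_deg + roll_rate_dps * dt_s
```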

The challenge, however, is to provide reliable sensor signals in a variety of conditions: in extreme heat or cold, on motorways, or on gravel roads, for example. The same requirement applies to inertial sensors for electronic stability control (ESC), an active automotive safety feature that helps prevent skidding by selectively actuating the car’s brakes.

Figure 2. Bosch began testing automated driving on public roads at the beginning of 2013. The latest test vehicles are based on the Tesla Model S. (Image Credit: Robert Bosch GmbH)

One approach to the challenge is careful design that combines MEMS design expertise with an understanding of automotive systems and their requirements. Products must be designed according to specifications, and samples must first be tested in the lab to verify compliance with the written requirements. Finally, the sensors need to undergo real-world test drives, such as rides during winter or on gravel roads.

Navigating through Urban Canyons

Drivers have embraced in-dash navigation systems; the technologies lower the stress of self-navigation in unfamiliar cities. Relying on maps, Global Navigation Satellite System (GNSS) signals, and routing algorithms, these navigation systems may even provide real-time information on traffic jams via connectivity services, i.e., the traffic jam assist feature.

Automotive engineers favor adding inertial sensors to navigation systems because the system will still work in “urban canyons,” or areas where the GNSS signal is of poor quality, fails, or is unavailable. In such situations, inertial sensors can determine the change in position since the last trustworthy GNSS reading. If the GPS signal cannot be received while a driver is inside a tunnel, for example, the inertial sensors track the vehicle’s heading and the distance traveled in meters. Dead reckoning algorithms then calculate the position change; the current position can be extrapolated from the inertial sensors’ signals.
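As an illustration, the following Python sketch shows planar dead reckoning from the last trusted GNSS fix. It assumes a speed signal (e.g., from wheel-speed sensors) and a yaw rate from the MEMS gyroscope; all names and sample values are hypothetical.

```python
import math

def dead_reckon(x_m, y_m, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Propagate a 2D position estimate from the last trusted GNSS fix.

    speed_mps would come from wheel-speed sensors and yaw_rate_rps from the
    MEMS gyroscope; both signal names are illustrative.
    """
    heading_rad += yaw_rate_rps * dt_s            # integrate yaw rate -> heading
    x_m += speed_mps * math.cos(heading_rad) * dt_s
    y_m += speed_mps * math.sin(heading_rad) * dt_s
    return x_m, y_m, heading_rad

# Example: starting at the tunnel entrance, drive 2 s at 20 m/s while turning
# gently at 0.05 rad/s, sampled at 100 Hz.
x, y, heading = 0.0, 0.0, 0.0
for _ in range(200):
    x, y, heading = dead_reckon(x, y, heading, 20.0, 0.05, 0.01)
```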

Driver Assistance in Many Flavors

Driver assistance technologies are more than simply cruise control or rear backup cameras. Adaptive cruise control; lane keeping and lane changing assist; advanced emergency braking systems (AEBS); and active front steering are all variations of driver assistance — and made possible by the intelligent fusion of MEMS inertial sensors with perception systems such as cameras, radar, and/or LIDAR.

Adaptive cruise control is far more responsive to actual driving conditions than the familiar, traditional cruise control feature. While the old cruise control technology conserves gas and may prove more relaxing on long drives, who hasn’t experienced the annoying need to toggle cruise control off and on to match the variable speed of a nearby car? Rather than maintaining a single speed, adaptive cruise control adjusts the vehicle speed as needed in order to maintain a safe distance from other cars.

Figure 3. MEMS inertial sensors are integral to localization and navigation. (Image Credit: Robert Bosch GmbH)

Adaptive cruise control depends primarily on measuring distances to objects by using radar, cameras, or lasers. The same kind of inertial sensor that supports ESC also enables adaptive cruise control: the inertial sensor helps to predict the vehicle’s trajectory and relate that route to the detected obstacles.
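The distance-keeping idea can be sketched as a simple time-gap controller. The gains, limits, and 1.8-second headway below are illustrative assumptions, not any manufacturer’s tuning.

```python
def acc_command(gap_m, closing_speed_mps, ego_speed_mps):
    """Minimal time-gap controller sketch for adaptive cruise control.

    gap_m and closing_speed_mps would come from radar or camera object
    detection; all constants here are made up for illustration.
    """
    TIME_GAP_S = 1.8                                # desired headway
    desired_gap_m = 5.0 + TIME_GAP_S * ego_speed_mps
    gap_error_m = gap_m - desired_gap_m
    # Proportional-derivative law: accelerate when the gap is too large,
    # brake when it shrinks or the lead car is approaching.
    accel_cmd = 0.3 * gap_error_m - 0.5 * closing_speed_mps
    return max(min(accel_cmd, 2.0), -3.5)           # clamp to comfortable limits
```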

A similar inertial device also supports hill-hold control, a feature that keeps an uphill-driving vehicle from rolling backwards. A low-g sensor determines inclination by using the downward direction of gravity.
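A minimal sketch of that inclination measurement, assuming the vehicle is at standstill so the low-g accelerometer sees only the gravity component along the slope:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def road_grade_deg(longitudinal_accel_mps2: float) -> float:
    """Estimate road inclination from a low-g accelerometer at standstill."""
    ratio = max(min(longitudinal_accel_mps2 / G, 1.0), -1.0)  # guard asin domain
    return math.degrees(math.asin(ratio))

# e.g. 1.7 m/s^2 along the vehicle axis corresponds to roughly a 10-degree hill
```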

Active steering, another driver assistance technology, reduces the change in steering angle produced by each movement of the steering wheel at higher speeds. The feature supports more precise driving on highways. Yaw rate sensors provide the relevant information about sudden changes in motion.
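One common way to realize this behavior is a speed-dependent steering ratio. The breakpoints and ratios in this Python sketch are invented for illustration only.

```python
def steering_ratio(speed_kph: float) -> float:
    """Illustrative speed-dependent steering ratio for active front steering.

    Low speed -> direct ratio for easy maneuvering; high speed -> indirect
    ratio so small wheel movements change the road-wheel angle less.
    """
    LOW_SPEED, HIGH_SPEED = 30.0, 120.0   # km/h breakpoints (assumed)
    DIRECT, INDIRECT = 10.0, 18.0         # steering-wheel-to-road-wheel ratios (assumed)
    if speed_kph <= LOW_SPEED:
        return DIRECT
    if speed_kph >= HIGH_SPEED:
        return INDIRECT
    # Linear blend between the two regimes.
    t = (speed_kph - LOW_SPEED) / (HIGH_SPEED - LOW_SPEED)
    return DIRECT + t * (INDIRECT - DIRECT)
```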

The good news is that some driver assistance systems are already available in mid-priced vehicles, rather than just luxury cars. While BMW was early to market with active steering, Ford offers active steering on its Ford Edge; other automakers will soon follow suit.

Much in the same way that inertial sensors support cameras, radar, and lasers for driver assistance, they can also support automated driving by predicting the motion of the car.

Look Ma, No Hands!

Current autopilots combine a series of ADAS functions that already exist. Fully autonomous cars need to know their environment in detail and must interpret and predict the behavior of other cars and pedestrians. Using high-precision maps and vision systems, perception technology must anticipate vehicle trajectories on highways. Such predictions are easier to achieve for highway driving than for anticipating car and pedestrian trajectories in urban driving (see Figure 2). Artificial intelligence based on “deep learning” is imperative for achieving the cognition required for fully autonomous cars.

Localization and Navigation

Figure 4. Automated driving will be implemented in stages. (Image Credit: Robert Bosch GmbH)

In fully autonomous driving, the car becomes a robot that answers the questions: “Where am I?”, “Where do I want to go?”, and “How am I going to achieve that?” Inch-scale localization, which answers the first of those questions, is essential to automated driving and autonomous vehicles. In contrast to the navigation that directs us to the nearest Starbucks, localization pinpoints one’s position within the lane of a street (see Figure 3).

Inside the self-driving car, two different technological approaches converge for self-localization: robotics and transportation.

Using perception systems like cameras, LIDAR, and radar, robotics researchers have developed new methods to determine a vehicle’s position relative to surrounding objects. For example, when following the approach of simultaneous localization and mapping (SLAM), the robotic car creates a map of its surrounding environment and relates its current position to that map. By matching prominent landmarks from this map against their positions in a stored high-precision map, the absolute position can be determined.
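As a toy example of the landmark step, the sketch below recovers the car’s absolute position from a single matched landmark, assuming the heading is already known (e.g., from the gyroscope). A real SLAM system fuses many landmarks probabilistically; this is only the geometric core.

```python
import math

def absolute_position(landmark_map_xy, range_m, bearing_rad, heading_rad):
    """Recover the car's absolute position from one matched landmark.

    landmark_map_xy is the landmark's position in the stored high-precision
    map; range_m and bearing_rad are the perception system's relative
    measurement; heading_rad is assumed known. All names are illustrative.
    """
    lx, ly = landmark_map_xy
    world_angle = heading_rad + bearing_rad   # bearing expressed in map frame
    # The car sits at the landmark position minus the measured offset.
    x = lx - range_m * math.cos(world_angle)
    y = ly - range_m * math.sin(world_angle)
    return x, y
```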

Field-proven in the transportation industry, inertial navigation systems (INS) determine change in absolute position by measuring accelerations and rotations. Starting from an absolute position — which the system can deduce from GNSS readings, landmark navigation, or SLAM — the strapdown algorithm calculates a new position based on the readings of the inertial sensor.
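A heavily simplified, planar version of that strapdown integration might look like the following sketch. It ignores gravity compensation, sensor biases, and the full 3D attitude math that a real INS must handle.

```python
import math

def strapdown_step(state, accel_body_mps2, yaw_rate_rps, dt_s):
    """One step of a planar strapdown integration (a simplified sketch).

    state = (x, y, vx, vy, heading); accel_body_mps2 = (ax, ay) in the
    vehicle frame. A full strapdown algorithm works in 3D and corrects
    for gravity and sensor errors; all of that is omitted here.
    """
    x, y, vx, vy, heading = state
    heading += yaw_rate_rps * dt_s
    # Rotate body-frame acceleration into the navigation frame.
    ax, ay = accel_body_mps2
    ax_nav = ax * math.cos(heading) - ay * math.sin(heading)
    ay_nav = ax * math.sin(heading) + ay * math.cos(heading)
    # Integrate acceleration -> velocity -> position.
    vx += ax_nav * dt_s
    vy += ay_nav * dt_s
    x += vx * dt_s
    y += vy * dt_s
    return (x, y, vx, vy, heading)
```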

Depending on the targeted accuracy, INS may require high-performance sensors, since sensor drifts and errors accumulate quickly. The highest demands are met by optical sensors, such as ring laser gyros and fiber-optic gyros. In recent years, high-performance MEMS sensors have successfully entered the market for tactical-grade sensors.

Fusing the Data of Inertial Sensors and Perception Sensors

How do vision and perception systems benefit from inertial sensors? Visual, or perception, sensors track moving objects — “the optical flow” — in order to reliably determine “structure from motion,” and to estimate the car’s own motion and its distance to other traffic participants.

Inertial sensors are completely independent of a perceiving sensor’s limiting factors, such as weather conditions, suitable daylight, snowy roads, or obscured landmarks. Inertial sensors do not depend on a scene’s illumination because they sense motion directly rather than computing it from pictures. Additionally, inertial sensors are more secure because they do not rely on any connectivity or data communication external to the car. Current research discusses both a loose coupling and a tight coupling for fusing inertial and visual information.

Figure 5. Autonomous Driving Market Shares. (Credit: “Revolution in the Driver’s Seat: The Road to Autonomous Vehicles,” The Boston Consulting Group, April 2015)

When employing a loose coupling, the perception system and the INS each localize the car almost independently, then mutually compare and correct their results afterwards. Tight coupling of inertial and visual sensors offers a second option, where direct (pixel-level) visual measurements of objects are combined with inertial measurement unit (IMU) readings.
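For the loose coupling, the final combination step can be as simple as a variance-weighted update, as in this one-dimensional Python sketch (the variances are assumed known; all values are illustrative).

```python
def fuse_loose(ins_pos, ins_var, vision_pos, vision_var):
    """Loosely coupled fusion sketch: both subsystems localize independently
    and their position estimates are combined afterwards, weighted by their
    (assumed known) variances - the classic measurement-update step.
    """
    k = ins_var / (ins_var + vision_var)            # Kalman-style gain
    fused_pos = ins_pos + k * (vision_pos - ins_pos)
    fused_var = (1.0 - k) * ins_var
    return fused_pos, fused_var

# Example: the INS says 105.2 m with variance 4.0; the camera-based localizer
# says 104.0 m with variance 1.0 -> the fused estimate leans toward vision.
pos, var = fuse_loose(105.2, 4.0, 104.0, 1.0)
```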

In both approaches, the MEMS inertial sensor improves the capability of the perception system to follow objects from frame to frame, which can result in improved accuracy of localization.

Where Do We Go from Here?

On the evolutionary pathway of automated driving functions, driver assistance systems — such as lane keeping and lane changing assist, AEBS, and active front steering — will become more commonplace. Partially automated functions, such as traffic jam assist, are already in the market. Traffic jam assist will gradually expand over the next few years, with higher levels of automated functions soon to follow (see Figure 4).

By the end of this decade, expect to see fully automated driving on highways. Fully automated driving in urban areas, however, will probably take another 10 or 15 years to achieve (see Figure 5).

With such rich technologies at their disposal, automakers will continue to satisfy consumer demand for more widespread implementation of ADAS well before fully automated, and even partially automated, driving functions reach the majority of consumers. While fully automated driving may take years to accomplish, we are already benefiting from MEMS- and sensor-enabled ADAS in the family car.

This article was written by Karen Lightman, executive director, MEMS & Sensors Industry Group, and Peter Spoden, product manager, inertial sensors, Automotive Electronics, Robert Bosch GmbH (Reutlingen, Germany).