Forward-facing cameras, integrated with vehicle controls, are being used to recognize pedestrians, signs, and other cars and motorcycles. Automatic braking systems, often fed by a combination of radar, cameras, and other sensors, can halt a vehicle as it approaches an object ahead. Vehicle-mounted cameras can also register road markings and help keep drivers within their own lanes.
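The braking decision in such systems is often described in terms of time-to-collision: the distance to the object ahead divided by the closing speed. The short Python sketch below is purely illustrative, with invented thresholds and simplified inputs, and does not represent any particular manufacturer's implementation.

```python
def ttc_seconds(range_m: float, closing_speed_mps: float) -> float:
    """Time-to-collision: distance to the object ahead divided by closing speed."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # not closing on the object: no collision course
    return range_m / closing_speed_mps


def brake_command(range_m: float, ego_speed_mps: float, lead_speed_mps: float,
                  warn_ttc_s: float = 2.5, brake_ttc_s: float = 1.2) -> str:
    """Map a fused radar/camera range estimate to a staged response (invented thresholds)."""
    ttc = ttc_seconds(range_m, ego_speed_mps - lead_speed_mps)
    if ttc < brake_ttc_s:
        return "FULL_BRAKE"            # autonomous emergency braking
    if ttc < warn_ttc_s:
        return "WARN_AND_PRECHARGE"    # alert the driver, pre-fill brake pressure
    return "NO_ACTION"


# Closing on a stopped car 20 m ahead at 15 m/s gives a TTC of roughly 1.3 s:
print(brake_command(range_m=20.0, ego_speed_mps=15.0, lead_speed_mps=0.0))
```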

Figure 1. Lane-keeping assist technology integrates a TRW forward-looking camera and electric power steering system to help the driver more intuitively move back into the center of the lane. (Image: TRW Automotive)
Volvo’s new luxury SUV, the XC90, for example, features a front-facing camera and radar mounted in the upper part of the windscreen, behind the rear-view mirror. The imaging and radar capabilities trigger automatic braking to avoid a collision, or apply steering-wheel torque when a driver veers off course.

These types of automated safety features are examples of a technology called Advanced Driver Assistance Systems (ADAS). These camera-based ADAS capabilities, however, are moving beyond luxury models and into a growing number of mainstream cars. As the migration occurs, OEMs will need to prove the robustness of the capabilities to new customers.

Camera Creep

Although many expensive imaging technologies, like infrared cameras for night vision and backseat monitoring, are still reserved for high-end Mercedes, Lexus, and Cadillac models, an increasing number of cameras and sensors are moving into more conventional cars. Automatic windshield wipers that sense rain, backup cameras, and automated headlight dimmers, for example, have become common features in today’s vehicles.

“We’re seeing cameras creep down in the more mass-produced vehicles because quite frankly it’s getting more affordable for car manufacturers to do so,” said Alex Shikany, Director of Market Analysis for the Ann Arbor, MI-based Association for Advancing Automation (A3) research firm.

Figure 2. Blind Spot Detection (BSD) technology warns the driver when there are vehicles in the blind spot of the side-view mirror. (Image: Continental Corp.)
The safety features, Shikany said, are becoming more familiar to consumers and increasingly expected, often because of heavy media coverage of safety issues and recalls.

“People want to be safe in their vehicles,” Shikany said. “I would say that the average consumer tilts more toward the safety expectations than the bells and whistles of night vision or a driver recording system. Those are more niche.”

Automotive standards and safety ratings have also pushed ADAS imaging technology into a greater number of vehicles, as manufacturers aim for five-star scores. Euro NCAP Advanced ratings, for example, recognize driver assistance features like blind spot monitoring, lane support systems, speed alert systems, and autonomous emergency braking.

“These kinds of ratings, like the Euro NCAP, are driving the OEMs to introduce these features in pretty much all of their cars,” said Shikany, “not only, let’s say, the expensive ones.”

Staying in the Lane

Lane support is one ADAS technology increasingly finding a place inside vehicles. Traditional lane-departure warnings alert the driver with an audible buzz or a rumble in the seat. TRW Automotive, the Livonia, MI-based vehicle safety supplier, has developed a more “active-assist” approach: lane-keeping steering technology that uses a forward-facing camera, placed in the rear-view mirror mount, to pick up lane markings in the road.

The information is then sent to an electric power steering system, which automatically applies steering torque in either direction to guide drivers back to the center of the lane (see Figure 1). John Wilkerson, Senior Communications Manager at TRW, said that users tend to prefer the “active-assist” aspect of the technology over the pure warnings of previous systems.
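In broad terms, the assist loop converts the camera's estimate of the vehicle's position between the lane markings into a small corrective torque request for the electric power steering. The sketch below shows a generic proportional controller of that kind; the gains and torque limit are invented for illustration, and this is not TRW's algorithm.

```python
def lane_keeping_torque(lateral_offset_m: float, heading_error_rad: float,
                        k_offset: float = 1.5, k_heading: float = 3.0,
                        max_torque_nm: float = 3.0) -> float:
    """Corrective steering torque from the camera's lane-position estimate.

    lateral_offset_m: distance from the lane center (positive = drifted right)
    heading_error_rad: angle between the vehicle heading and the lane direction
    Gains and the torque limit are invented for illustration.
    """
    torque = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    # Clamp the request so the assist nudges the wheel rather than overriding the driver.
    return max(-max_torque_nm, min(max_torque_nm, torque))


# Drifting 0.4 m to the right while angled slightly further away from center:
print(lane_keeping_torque(0.4, 0.05))   # small negative torque steers back to the left
```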

“You may get a warning, and you don’t realize intuitively what you want to do right away,” Wilkerson said. “So an assist system tends to help you understand what it is that you need to do.”

TRW’s lane-keeping assist system was introduced on the Lancia Delta in 2008. The technology, Wilkerson said, will be integrated into Chrysler vehicles, including the 2014 Jeep Cherokee, and into GM pickup trucks, demonstrating a shift of the technology into more mainstream platforms.

“This’ll really continue to be a trend, and we’re going to see a lot more penetration in the normal vehicle range than we have in the past,” said Wilkerson.

Proving ADAS “Robustness”

Figure 3. The HMEye Cockpit Concept uses eye gaze direction data, along with head direction and other image attributes, and combines it with advanced steering wheel controls. (Image: Visteon Corp.)
OEMs and manufacturers still must ensure that ADAS products are capable without annoying the driver. To address customer expectations and drivers who are new to the technology, Christian Schumacher, head of the Advanced Driver Assistance Business Unit at the Hanover, Germany-based automotive supplier Continental Corporation, emphasized the importance of what he calls “robustness”: a safety feature should be active only in the situations it is designed for and should not warn drivers when there is no hazard.

Lane departure warning, for example, is one feature whose alerts may initially annoy or confuse drivers, he said, and Continental is focusing on improving the human-machine interface to better support the driver.

If a driver intentionally crosses a lane marker, for example, or does so because he or she wants to drive in a more sporty fashion, that driver often does not want to be alerted to the line crossing, and may simply switch the feature off, an obvious problem for a safety technology.
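One common way to reduce such nuisance alerts is to check for signs of driver intent, such as an active turn signal or firm steering input, before warning. The following heuristic is a simplified illustration of that idea, not Continental's interface logic; the torque threshold is invented.

```python
def should_warn_lane_departure(crossing_lane: bool, turn_signal_on: bool,
                               steering_torque_nm: float,
                               intent_torque_nm: float = 2.0) -> bool:
    """Warn only when a lane crossing looks unintentional (illustrative heuristic)."""
    if not crossing_lane:
        return False
    if turn_signal_on:
        return False                        # signaled lane change: assume intent
    if abs(steering_torque_nm) > intent_torque_nm:
        return False                        # firm, deliberate steering input
    return True                             # drifting with no sign of intent


print(should_warn_lane_departure(True, turn_signal_on=False, steering_torque_nm=0.3))  # True
print(should_warn_lane_departure(True, turn_signal_on=True, steering_torque_nm=0.3))   # False
```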

“That’s where we really have to focus: to get the consumer acceptance to make these features better, because lane departure warning is such a tremendous, important feature,” said Schumacher. “You just have to make sure that the reputation it gets will be improved.”

Schumacher emphasized the importance of helping drivers understand the features, and he sees a positive example in Continental’s Blind Spot Detection (BSD), a system that places radar sensors in the left and right corners of the car, monitoring nearby vehicles and warning drivers as they change lanes. In most applications, the warning takes the form of a small illuminated icon in the side-view mirror (see Figure 2).
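In outline, the system turns the corner-radar tracks into a per-side occupancy decision and lights the mirror icon accordingly, escalating if the driver signals toward an occupied side. The sketch below is an illustration under those assumptions, with an invented zone limit, and is not Continental's production logic.

```python
from dataclasses import dataclass


@dataclass
class RadarTrack:
    side: str       # "left" or "right"
    range_m: float  # distance to the tracked vehicle


def blind_spot_state(tracks: list[RadarTrack], signaling_side: str | None,
                     zone_limit_m: float = 4.0) -> dict[str, str]:
    """Per-side warning level from corner-radar tracks (invented zone limit)."""
    state = {"left": "CLEAR", "right": "CLEAR"}
    for track in tracks:
        if track.range_m <= zone_limit_m:
            # Light the mirror icon; escalate if the driver signals into that side.
            state[track.side] = "ALERT" if signaling_side == track.side else "ICON_ON"
    return state


tracks = [RadarTrack(side="left", range_m=3.0)]
print(blind_spot_state(tracks, signaling_side=None))    # {'left': 'ICON_ON', 'right': 'CLEAR'}
print(blind_spot_state(tracks, signaling_side="left"))  # {'left': 'ALERT', 'right': 'CLEAR'}
```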

“At the beginning, people were not necessarily demanding this system as they went into dealerships. They buy the car, and sometimes it has BSD,” said Schumacher, “but we see a significantly high rate of people that are saying: if they once had the feature, they want it again. We see more acceptance.”

Opportunities to Integrate

To manage consumer skepticism and convince mainstream buyers that automated systems work, imaging technologies can be used to improve ADAS tools and cut down on drawbacks like false positives.

Visteon Corporation, a Van Buren Township, MI-based automotive supplier, demonstrated its HMEye concept at the 2014 International CES, the popular consumer electronics and technology trade show. HMEye uses mounted infrared cameras to monitor eye gaze and head position, primarily as a human-machine interface control mechanism that lets a driver make menu selections using only eye movement.

Based on a quick glance, the cameras detect which icon on a display menu the driver is looking at, such as navigation, audio, or climate controls. If the driver wishes to select the highlighted icon, he or she can click a button on the steering wheel to activate that mode.
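Functionally, this amounts to mapping the estimated gaze point onto regions of the display, highlighting the icon under the gaze, and committing the selection only when the steering-wheel button is pressed. The sketch below uses hypothetical icon coordinates and is not Visteon's implementation.

```python
# Hypothetical display layout: icon name -> (x_min, y_min, x_max, y_max) in pixels.
MENU_ICONS = {
    "navigation": (0, 0, 100, 80),
    "audio": (110, 0, 210, 80),
    "climate": (220, 0, 320, 80),
}


def icon_under_gaze(gaze_x: float, gaze_y: float) -> str | None:
    """Return the menu icon the driver's gaze currently falls on, if any."""
    for name, (x0, y0, x1, y1) in MENU_ICONS.items():
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return name
    return None


def handle_frame(gaze_x: float, gaze_y: float, button_pressed: bool) -> str:
    """Highlight the gazed-at icon; activate it only on a steering-wheel button press."""
    icon = icon_under_gaze(gaze_x, gaze_y)
    if icon is None:
        return "no highlight"
    return f"activate {icon}" if button_pressed else f"highlight {icon}"


print(handle_frame(150, 40, button_pressed=False))  # highlight audio
print(handle_frame(150, 40, button_pressed=True))   # activate audio
```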

Tying a system like HMEye to a forward-looking collision avoidance feature opens up opportunities to further improve driver safety and ADAS accuracy. Though not shown in the CES demo, the camera system can use the same data to spot signs of a drowsy or distracted driver, for example if the eyes are off the road or the driver is blinking rapidly (see Figure 3). By combining the two setups, the integrated technology could simultaneously monitor a driver’s facial motions while detecting, say, a parked or stopped car up ahead.
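Coupled together, the two data streams could let the system tune its warnings to the driver's state, alerting earlier and more forcefully when the gaze is off the road or blink patterns suggest drowsiness. The sketch below is a speculative illustration of that coupling, with invented thresholds.

```python
def warning_level(hazard_ttc_s: float, eyes_on_road: bool,
                  blink_rate_per_min: float, drowsy_blink_rate: float = 30.0) -> str:
    """Scale a forward-collision warning by driver attention (invented thresholds)."""
    inattentive = (not eyes_on_road) or blink_rate_per_min > drowsy_blink_rate
    # Warn earlier for an inattentive or drowsy driver; otherwise stay quiet longer.
    threshold_s = 3.5 if inattentive else 2.0
    if hazard_ttc_s >= threshold_s:
        return "NONE"
    return "URGENT" if inattentive else "STANDARD"


print(warning_level(3.0, eyes_on_road=True, blink_rate_per_min=12))   # NONE
print(warning_level(3.0, eyes_on_road=False, blink_rate_per_min=12))  # URGENT
```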

“ADAS systems are great, but it’s the false warnings that make people sort of tune out from the warning messages of those systems. If you couple that into driver monitoring systems, where the camera is actually looking at the driver and knows what they’re doing, you can actually get it to the point where the system will tell you when you need to know. That’s a really big improvement across the board,” said Upton Bowden, Electronics Marketing and Portfolio Planning Manager for Visteon.

Most of today’s ADAS systems are discrete and separate, according to Dave DeCoste, Senior Manager, Marketing and Communications at Visteon: one system uses one camera to process one feature. As the number of cameras, sensors, and car-to-car connectivity capabilities in vehicles increases, he sees more opportunities to bring the technologies together.

“I think over the next five years what you’re going to see is a lot of integration of those systems. So you’ll have one more powerful multicore processor that is pulling in numerous feeds and compiling them and pulling out the necessary data, and that gets fed over to whatever mechanism notifies the driver of threats,” said DeCoste.
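The architecture DeCoste describes centralizes perception: many feeds in, one fused picture out, with driver notification handled downstream. The sketch below is a loose illustration of that idea, with invented feed names, rather than any Visteon design.

```python
from collections.abc import Callable


# Hypothetical sensor feeds: each callable returns a list of detected objects.
def front_camera_feed() -> list[dict]:
    return [{"type": "vehicle", "range_m": 18.0}]


def front_radar_feed() -> list[dict]:
    return [{"type": "vehicle", "range_m": 17.6}]


def fuse_and_notify(feeds: list[Callable[[], list[dict]]],
                    notify: Callable[[str], None]) -> None:
    """Pull in every feed, keep the nearest detected object, and notify the driver once."""
    detections = [obj for feed in feeds for obj in feed()]
    if not detections:
        return
    nearest = min(detections, key=lambda obj: obj["range_m"])
    notify(f"{nearest['type']} ahead at {nearest['range_m']:.1f} m")


fuse_and_notify([front_camera_feed, front_radar_feed], notify=print)
# vehicle ahead at 17.6 m
```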

ADAS on the Mind

As the Advanced Driver Assistance Systems work their way into the more mainstream automotive market, the imaging technology looks to be increasingly important to consumers and manufacturers alike. Features like blind spot detection and collision avoidance sensors have become more common in today’s vehicles. Safety ratings on cars, a major selling point for automotive OEMs, may also rely on emerging ADAS imaging technologies, according to analysts, including Alex Shikany.

“I think companies are going to realize that putting in these vision technology systems, maybe even making some of them more standardized, is going to help their image, and it’s also going to help them sell more vehicles, because it’s on the top of people’s minds right now,” said Shikany.

This article was written by Billy Hurley, Associate Editor, NASA Tech Briefs. Contact the author for questions or more information.



This article first appeared in the September 2014 issue of Imaging Technology Magazine.
