Although a NASA-built sensor being tested in the Mojave Desert this summer will be used to support the safe landing of rovers on Mars, one of the technology's lead researchers sees other commercial possibilities. Principal Investigator Farzin Amzajerdian and his team at NASA's Langley Research Center developed the Navigation Doppler Lidar, a bread-box-sized sensor system that could someday play a role in aircraft navigation and even driverless car applications.


Bruce Barnes, who specializes in electronics engineering and system integration for the Navigation Doppler Lidar, makes final preparations to the sensor in a lab at NASA's Langley Research Center. (Credit: NASA/David C. Bowman)

The Mars Curiosity rover reached the surface of Mars in August of 2012. The rover stuck the landing, thanks in part to NASA's Terminal Descent Sensor and inertial measurement units that provided altitude and velocity data. TDS's six narrow-beam antennas enabled a precise, soft touchdown.

NASA's Navigation Doppler Lidar performs a similar function to the Terminal Descent Sensor, providing the vehicle's direction and descent speed. Where the TDS uses multiple antennas, however, the NDL uses three laser beams that propagate toward the ground in slightly different directions.

Using a technique called frequency modulated continuous wave (FMCW), the lidar transmits three individual laser beams and measures line-of-sight range and velocity along each beam. The three velocity measurements are combined to determine the vehicle's velocity vector, that is, its speed and direction. Computing altitude from multiple beams also reduces the inaccuracies caused by terrain features, such as boulders and craters, compared with a single-beam radar or lidar altimeter.
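The geometry behind combining the three measurements can be sketched as a small linear system: each beam's unit direction vector dotted with the vehicle velocity gives that beam's line-of-sight return, and with three non-coplanar beams the 3 × 3 system inverts uniquely. The beam angles and velocities below are illustrative assumptions, not the NDL's actual geometry or flight data.

```python
import numpy as np

# Assumed beam geometry: three unit vectors from the vehicle toward the
# ground, each tilted ~22 deg from nadir at different azimuths
# (illustrative values only; z points up).
tilt = np.radians(22.0)
azimuths = np.radians([0.0, 120.0, 240.0])
beams = np.array([
    [np.sin(tilt) * np.cos(az), np.sin(tilt) * np.sin(az), -np.cos(tilt)]
    for az in azimuths
])  # each row is one beam's unit direction vector

# Hypothetical vehicle velocity (m/s): descending 10 m/s, drifting 2 m/s.
v_true = np.array([2.0, 0.0, -10.0])

# Each beam senses only the component of velocity along its own axis
# (a dot product), which is what the Doppler shift encodes.
v_los = beams @ v_true

# Three non-coplanar beams make the system invertible, so the full
# velocity vector is recovered from three scalar measurements.
v_est = np.linalg.solve(beams, v_los)

speed = np.linalg.norm(v_est)  # speed; direction is v_est / speed
```

The same inversion explains why three beams are the minimum: with fewer, the along-track and cross-track components of velocity cannot be separated.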

The biggest difference between the NDL and TDS is each sensor's transmitted frequency. Using a near-infrared laser, the NDL has a frequency that is more than three orders of magnitude higher than state-of-the-art radars, said Amzajerdian. The increased laser frequency translates to higher measurement precision and a much less divergent beam.
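The scale of that frequency gap can be checked with back-of-the-envelope arithmetic. The numbers below are assumptions for illustration: a 1.55 μm near-infrared wavelength (typical of telecom-grade lasers) and a Ka-band radar near 36 GHz. The Doppler shift per unit of closing velocity is f_d = 2vf/c, so a higher carrier frequency produces a proportionally larger, easier-to-resolve shift.

```python
C = 299_792_458.0  # speed of light, m/s

# Assumed carrier frequencies (illustrative, not the actual NDL/TDS specs).
laser_freq = C / 1.55e-6   # ~193 THz for a 1.55 um near-IR laser
radar_freq = 35.75e9       # ~36 GHz, a representative Ka-band radar

ratio = laser_freq / radar_freq  # several thousand: >3 orders of magnitude

# Doppler shift for a target closing at 1 m/s: f_d = 2 * v * f / c.
v = 1.0
doppler_laser = 2 * v * laser_freq / C  # on the order of MHz per m/s
doppler_radar = 2 * v * radar_freq / C  # on the order of hundreds of Hz per m/s
```

Under these assumed values, each meter per second of closing speed shifts the laser return by roughly a megahertz versus a few hundred hertz for the radar, which is the precision advantage Amzajerdian describes.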

Because the laser beams are collimated, unlike radar's diverging radiation, ground clutter has little effect on the measurements. The result is more precise, higher-quality data with very few false alarms.

“We are essentially transmitting a pencil beam, which is not going to be obstructed by the vehicle's structures or affected by the features of the terrain,” said Amzajerdian.

Cheaper and Lighter

The Navigation Doppler Lidar's compact design also supports NASA's ongoing efforts to reduce the size and mass of its space-bound cargo.

The NDL consists of an electronics chassis and an optical head with three 2” lenses. Together, the 8 × 9 × 11” chassis and optical head have a mass of approximately 13 kg, almost half that of the TDS landing technology, a 4’-long plate with protruding electronics and large antennas.

“The weight of the optical head can be further reduced if we use smaller lenses,” said Amzajerdian. “But that means shorter operational range or altitude.”

Additionally, because the NDL's fiber optic components, detectors, laser, and other parts are standard items used by the telecom industry, the technology can be developed at low cost, he added.

The Navigation Doppler Lidar will be flight tested aboard a rocket-powered test vehicle named Xodiac, created by the Mojave, CA-based aerospace manufacturer Masten Space Systems. An open-loop test, in which the NDL will not operate as part of the vehicle guidance system, is set to begin in early spring. In the closed-loop flight test planned for this summer, the lidar data will be used to navigate the vehicle.

The NDL will be assessed as part of the NASA payload called COBALT, or CoOperative Blending of Autonomous Landing Technologies, a joint technology development effort among multiple NASA centers. The payload includes the Lander Vision System (LVS), a JPL-built sensor that provides accurate position data.

The NDL (shown) is a small electronics box connected, by fiber optic cables, to three lenses. The lenses transmit three laser beams. (Credit: NASA/David C. Bowman)

“The Lander Vision System will tell the vehicle where it is, relative to where it is supposed to go,” said Amzajerdian. “The NDL provides the navigation data for the lander to navigate to that location.”

Mars and Earth Applications

Beyond its main role of landing rovers safely and precisely on Mars, Amzajerdian sees the lidar sensor being used eventually to enable terrestrial applications where GPS is unable to supply surface-relative altitude and velocity data. Helicopters, for example, could potentially navigate more precisely and land safely when a pilot's visibility is impaired by blowing dust.

“In those types of situations, having good information about velocity and the altitude relative to the local ground gives some options for the pilot to be able to land in degraded visual environments,” said Amzajerdian.

Autonomous vehicles like Google's self-driving car require a scanning lidar that measures distances and generates 3D images of surroundings. Amzajerdian also sees the NDL as a way of someday providing similar visual-sensing capabilities for driverless cars, such as measuring the velocity of nearby objects and pedestrians, and calculating their distance from the vehicle.

There are hurdles, of course, to transferring a Mars-bound device to the automotive industry. NASA sensors are bulky, expensive, and currently impractical for automakers. As with any new technology, Amzajerdian said, maturing it and making it efficient and affordable will take time.

For now, the focus is on bringing the Navigation Doppler Lidar to Mars. If all goes according to plan, Amzajerdian and his team expect to see the NDL on landers by 2021.

This article was written by Billy Hurley, Associate Editor, NASA Tech Briefs.

Sensor Technology Magazine

This article first appeared in the March 2017 issue of Sensor Technology Magazine.
