
This technology enables a micro aerial vehicle to transition autonomously between indoor and outdoor environments through windows and doors, using only monocular vision.

Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). Autonomous navigation, however, requires information to identify the target and guide the vehicle to it. Vision algorithms can provide egomotion estimation and target detection using input from cameras, which are easy to integrate into miniature systems.

Target detection based on visual feature tracking and planar homography decomposition identifies candidate targets for automated landing or building ingress and produces 3D waypoints that locate them. The vehicle control algorithm collects these waypoints, estimates an accurate target position, and executes the automated maneuvers for autonomous landing or building ingress.
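The sketch below illustrates the general idea of homography-based plane detection using standard OpenCV calls; it is not the flight code, and the camera intrinsics, feature tracker, and pruning heuristics mentioned in the comments are assumptions for illustration.

```python
# Minimal sketch (not the authors' implementation): detect a planar navigation
# target by estimating the homography induced by tracked features between two
# frames and decomposing it into camera motion and a plane normal.
import numpy as np
import cv2

def plane_from_homography(pts_prev, pts_curr, K):
    """Estimate and decompose the homography induced by a planar target.

    pts_prev, pts_curr: Nx2 float arrays of tracked feature locations (pixels).
    K: 3x3 camera intrinsic matrix (assumed known from calibration).
    Returns candidate (R, t, n) triples. The translation t is only known up to
    scale, so an external measurement (e.g., motion capture) is needed to
    recover metric scale, as described in the text.
    """
    H, inliers = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 3.0)
    if H is None:
        return []
    num, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
    # Each candidate describes the camera motion relative to the plane (e.g.,
    # a rooftop landing surface or a window plane). In practice, physically
    # implausible solutions are pruned, for example by requiring tracked
    # points to lie in front of the camera and the plane normal to face it.
    return [(Rs[i], ts[i], normals[i]) for i in range(num)]
```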

Systems deployed outdoors can use GPS data for pose recovery, but this is not an option for systems operating indoors or in deep space. To cope with this limitation, a system was developed on a small unmanned aerial vehicle (UAV) platform with a minimal sensor suite that can operate using only onboard resources to autonomously achieve basic navigation tasks. As a first step toward this goal, a navigation approach was developed that visually detects and reconstructs the position of navigation targets, but depends on an external VICON tracking system to regain scale and for closed-loop control.

A method was demonstrated for vision-aided autonomous navigation of a micro aerial vehicle with a single monocular camera, considering two example applications in urban environments: autonomous landing on an elevated surface and automated building ingress. The method requires no special preparation (labels or markers) of the landing or ingress locations. Rather, leveraging the planar character of urban structure, the vision system uses a planar homography decomposition to detect navigation targets and produce approach waypoints as input to the vehicle control algorithm. Scale recovery is achieved using motion capture data. A real-time implementation running onboard a micro aerial vehicle was demonstrated in experimental runs. The system generates highly accurate target waypoints. Using a three-stage control scheme, the vehicle is able to autonomously detect, approach, and land on an elevated landing surface that is only slightly larger than its own footprint, and to gather navigation target waypoints for building ingress. All algorithms run onboard the vehicle.
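The sketch below shows one way such a waypoint-collection and staged-approach scheme could be structured; the stage names, thresholds, and median-based target estimate are illustrative assumptions rather than the system described in the article.

```python
# Minimal sketch of accumulating vision-derived target waypoints and switching
# through a three-stage approach (detect, approach, final maneuver). All
# parameters below are assumed values for illustration.
import numpy as np

class TargetApproach:
    DETECT, APPROACH, FINAL = range(3)

    def __init__(self, min_waypoints=10, approach_radius=0.5):
        self.waypoints = []                      # accumulated 3D target estimates (m)
        self.stage = self.DETECT
        self.min_waypoints = min_waypoints       # samples needed before approaching (assumed)
        self.approach_radius = approach_radius   # switch-to-final distance in meters (assumed)

    def add_waypoint(self, p):
        """Add a new metric 3D target waypoint produced by the vision front end."""
        self.waypoints.append(np.asarray(p, dtype=float))

    def target_estimate(self):
        # Simple robust estimate: component-wise median over collected waypoints.
        return np.median(np.stack(self.waypoints), axis=0)

    def update(self, vehicle_position):
        """Advance the stage machine given the current vehicle position (3-vector)."""
        if self.stage == self.DETECT and len(self.waypoints) >= self.min_waypoints:
            self.stage = self.APPROACH
        elif self.stage == self.APPROACH:
            dist = np.linalg.norm(np.asarray(vehicle_position) - self.target_estimate())
            if dist < self.approach_radius:
                self.stage = self.FINAL          # final descent or ingress maneuver
        return self.stage
```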

This work was done by Roland Brockers, Jeremy C. Ma, and Larry H. Matthies of Caltech, and Patrick Bouffard of the University of California, Berkeley, for NASA's Jet Propulsion Laboratory. NPO-47841
