Micro aerial vehicles (MAVs) are lightweight, highly dynamic vehicles with limited payload, sensing, and computation capabilities. There is significant interest in automating MAVs for military surveillance, reconnaissance, and search-and-rescue missions. The current state of the art in MAV control relies on an external system of cameras or other sensors to localize the vehicle during flight. Missions in unknown indoor and outdoor environments, however, require precise MAV localization using only onboard computing and sensing resources. In general, MAVs may have to operate using only a low-performance inertial measurement unit and a single camera, since the use of other sensors such as GPS or altimeters is limited by payload or environmental constraints.
A Feature and Pose Constrained Extended Kalman Filter (FPC-EKF) was developed that uses camera images and inertial measurements to provide onboard vehicle localization. The FPC-EKF framework augments the vehicle's state with previous vehicle poses and critical environmental features, including vertical edges. This framework allows hundreds of opportunistic visual features to be tracked over several camera frames, while a few persistent features, including both points and vertical edges, are tracked for longer durations. In addition, vertical features in the environment are opportunistically used to provide global attitude references in the filter.
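To illustrate the pose-augmentation step described above, the following is a minimal sketch of how a filter can clone the current vehicle pose into an augmented state vector while preserving the correct cross-covariances (often called stochastic cloning). The function name, the use of NumPy, and the assumption that the pose occupies the first `pose_dim` elements of the state are illustrative conventions, not details of the FPC-EKF implementation.

```python
import numpy as np

def clone_pose(x, P, pose_dim=6):
    """Append a copy of the current pose to the state vector.

    x        : (n,) state vector, with the pose in the first pose_dim entries
               (illustrative layout, not the thesis's actual ordering)
    P        : (n, n) state covariance
    pose_dim : number of pose states to clone (e.g. 3 position + 3 attitude)

    Returns the augmented state (n + pose_dim,) and covariance
    (n + pose_dim, n + pose_dim). The Jacobian J maps the old state to the
    augmented state, so J P J^T automatically fills in the cross-covariance
    between the cloned pose and every other state.
    """
    n = x.size
    # J = [ I_n ; [I_pose  0] ]: identity on the old state, plus a row block
    # that copies out the pose.
    J = np.vstack([
        np.eye(n),
        np.hstack([np.eye(pose_dim), np.zeros((pose_dim, n - pose_dim))]),
    ])
    return J @ x, J @ P @ J.T

# Usage: a 9-state filter (6 pose + 3 velocity, say) clones its pose.
x = np.arange(9.0)
P = np.eye(9)
x_aug, P_aug = clone_pose(x, P)
```

Each time a camera frame is kept, the filter would clone a pose this way (and later marginalize old clones), so that feature measurements observed across several frames can constrain the relative motion between those poses.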
The key innovation of the FPC-EKF is that it tracks both image features (points and edges) and previous vehicle poses; existing filters track only one or the other. Tracking both image features and vehicle poses in the filter enables operation over a wider range of flight conditions. The new filter is designed to accommodate both operations such as slow movement, hovering, or ingress, where image features are persistently observed, and operations such as fast traverses or aerial maneuvering, where camera images change rapidly.