This software implements a new probabilistic framework for integrated multi-target detection and tracking of small moving objects in image sequences. Its specific application is tracking people in aerial imagery, where image stabilization is inherently noisy.

This approach introduces a new method for integrating the detection and tracking steps. First, instead of using only image information for detection, the output of the multiple hypothesis tracker is used to increase the likelihood of targets existing at certain image locations. These likelihoods are then combined, in a Bayesian manner, with a model of the image background to calculate the probability that each pixel belongs to an existing target, to other foreground, or to the background. These pixel-level probabilities both determine the detected measurements in the frame and influence the probabilities of hypothesized target tracks. This detection approach reduces missed detections, enabling more robust object tracking, and decreases sensitivity to user-selected foreground thresholds.
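As an illustration only, the sketch below shows one way such track-conditioned, pixel-level posteriors could be computed. It is not the delivered software: the per-pixel Gaussian background model, the uniform likelihood assumed for foreground classes, and all function and parameter names (`pixel_class_posteriors`, `track_prior`, `fg_likelihood`) are assumptions introduced here for clarity.

```python
import numpy as np

def pixel_class_posteriors(frame, bg_mean, bg_var, track_prior, fg_likelihood=0.05):
    """Per-pixel posterior over {existing target, other foreground, background}.

    frame, bg_mean, bg_var : HxW arrays; the background is modeled as a
        per-pixel Gaussian (an assumption for this sketch).
    track_prior : HxW array of prior target probabilities projected from the
        tracker's predicted target locations (e.g., blobs around predictions).
    fg_likelihood : assumed uniform intensity likelihood for foreground classes.
    """
    # Likelihood of each pixel intensity under the background model.
    bg_like = np.exp(-0.5 * (frame - bg_mean) ** 2 / bg_var) / np.sqrt(2.0 * np.pi * bg_var)

    # Priors: the tracker output raises the prior of "existing target" near
    # predicted tracks; the remaining mass is split between the other classes.
    p_target = track_prior
    p_other = 0.5 * (1.0 - track_prior)
    p_bg = 0.5 * (1.0 - track_prior)

    # Unnormalized posteriors via Bayes' rule.
    post_target = fg_likelihood * p_target
    post_other = fg_likelihood * p_other
    post_bg = bg_like * p_bg

    norm = post_target + post_other + post_bg
    return post_target / norm, post_other / norm, post_bg / norm
```

Thresholding or clustering the target/foreground posteriors (rather than raw image differences) is what yields the frame's detected measurements in this scheme, which is why the result is less sensitive to a hand-tuned foreground threshold.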

The use of a multiple hypothesis tracking (MHT) framework makes both the detection and tracking more robust, as it propagates several possibilities for what targets exist in the images, delaying data association decisions until more data are collected. A novel MHT infrastructure is required because the data association hypotheses influence what measurements are detected in the image; the number and locations of measurements can vary across different nodes of the hypothesis tree. Therefore, in this work the MHT was modified to allow for this variation, and the new governing probability expressions for combined detection and tracking were explicitly derived.
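A minimal sketch of the kind of hypothesis-tree bookkeeping this requires is shown below. The class and function names, and the choice to pass the detection and association routines in as callables, are illustrative assumptions, not the structure of the actual software; the point is only that each node carries its own measurement set, since detections are conditioned on that node's hypothesized tracks.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

Point = Tuple[float, float]

@dataclass
class HypothesisNode:
    """One node of the MHT hypothesis tree.

    Because detection is conditioned on the data-association hypothesis,
    each node stores its own measurements; sibling nodes may therefore
    disagree on how many measurements the current frame contains.
    """
    track_states: List[Point]                        # predicted target positions
    measurements: List[Point] = field(default_factory=list)
    log_prob: float = 0.0                            # cumulative hypothesis score
    children: List["HypothesisNode"] = field(default_factory=list)

def expand_node(
    node: HypothesisNode,
    detect: Callable[[List[Point]], List[Point]],
    associate: Callable[[List[Point], List[Point]], List[Tuple[List[Point], float]]],
) -> None:
    """Grow the tree by one frame for a single hypothesis node."""
    # Detections depend on this node's predicted tracks (the track-conditioned
    # prior described above), not on a single global detection pass.
    measurements = detect(node.track_states)
    # Each feasible association of measurements to tracks spawns a child
    # hypothesis whose score folds in that association's log-likelihood.
    for updated_states, log_like in associate(node.track_states, measurements):
        node.children.append(HypothesisNode(
            track_states=updated_states,
            measurements=measurements,
            log_prob=node.log_prob + log_like,
        ))
```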

This work was done by Michael T. Wolf, Adnan I. Ansar, and Curtis W. Padgett of Caltech for NASA's Jet Propulsion Laboratory.

This software is available for commercial licensing. Please contact Dan Broderick at the Jet Propulsion Laboratory. Refer to NPO-48904.