DARPA’s Mind’s Eye Program aims to develop a smart camera surveillance system that can autonomously monitor a scene and report back human-readable text descriptions of activities that occur in the video. An important aspect is whether objects are brought into the scene, exchanged between persons, left behind, picked up, etc. While some objects can be detected with an object-specific recognizer, many others are not well suited to this type of approach. For example, a carried object may be too small relative to the resolution of the camera to be easily identifiable, or an unusual object, such as an improvised explosive device, may be too rare or unique in its appearance to have a dedicated recognizer. Hence, a generic object detection capability, which can locate objects without a specific model of what to look for, is needed. This approach can detect objects even when they are partially occluded or overlap with humans in the scene.
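The article does not specify the detection algorithm, but one common building block for model-free detection of introduced or left-behind objects is change detection against a background model. The sketch below is a minimal, hypothetical illustration of that idea using frame differencing with NumPy; the function name and threshold are illustrative assumptions, not the actual NPO-48851 implementation.

```python
import numpy as np

def detect_changed_region(background, frame, threshold=30):
    """Flag pixels that differ from the background model by more than
    `threshold` and return a bounding box (x0, y0, x1, y1) around the
    changed region, or None if nothing changed.

    NOTE: illustrative sketch only; not the licensed NPO-48851 software.
    """
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

# Toy example: a "dropped object" appears as a bright patch in an
# otherwise static scene.
background = np.zeros((48, 64), dtype=np.uint8)
frame = background.copy()
frame[10:20, 30:40] = 200
print(detect_changed_region(background, frame))  # → (30, 10, 39, 19)
```

A production system would replace the static background with an adaptive model and add connected-component filtering to separate multiple objects, but the core idea of locating novel objects without an object-specific recognizer is the same.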
This work was done by Michael C. Burl, Russell L. Knight, and Kimberly K. Furuya of Caltech for NASA’s Jet Propulsion Laboratory.
The software used in this innovation is available for commercial licensing. Please contact Dan Broderick. Refer to NPO-48851.
This Brief includes a Technical Support Package (TSP).

Detection of Carried and Dropped Objects in Surveillance Video
(reference NPO48851) is currently available for download from the TSP library.