Synthetic Foveal Imaging Technology (SyFT) is an emerging discipline of image capture and image-data processing that offers the prospect of greatly increased capability for real-time processing of large, high-resolution images (including mosaic images) for such purposes as automated recognition and tracking of moving objects of interest. SyFT addresses the image-data-processing problem posed by the proposed development of gigapixel mosaic focal-plane image-detector assemblies for very-wide-field-of-view, high-resolution imaging intended to detect and track sparse objects or events within narrow subfields of view. Without the dynamic adaptation afforded by SyFT, identifying and tracking such objects or events would require post-processing an image-data space consisting of terabytes of data. Such post-processing would be time-consuming; as a consequence, significant events could be missed entirely because of their time evolution, or could not be observed at the required fidelity, absent such real-time adaptations as adjusting focal-plane operating conditions or aiming the focal plane in different directions to track the events.

A mosaic imaging system according to SyFT would be built from “smart” imager cells, each of which would contain a focal-plane image sensor and a reprogrammable controller.

The basic concept of foveal imaging is straightforward: in imitation of a natural eye, a foveal-vision image sensor is designed to offer higher resolution in a small region of interest (ROI) within its field of view. Foveal vision reduces the amount of unwanted information that must be transferred from the image sensor to external image-data-processing circuitry. The basic concept is not new in itself: image sensors based on it have been described in several previous NASA Tech Briefs articles, including active-pixel integrated-circuit image sensors that can be programmed in real time to effect foveal artificial vision on demand. What is new in SyFT is a synergistic combination of recent advances in foveal imaging, computing, and related fields, along with a generalization of the basic foveal-vision concept to admit a synthetic fovea that is not restricted to one contiguous region of an image.
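As a hedged illustration of the data reduction that foveal readout makes possible (a minimal sketch, not the NPO-44209 implementation), the following Python example keeps full resolution inside a programmable ROI and coarsely subsamples the rest of the frame; the frame size, ROI coordinates, and subsampling factor are illustrative assumptions:

# Minimal sketch of the foveal-readout idea: full resolution inside a
# programmable region of interest (ROI), coarse subsampling elsewhere,
# so far less pixel data leaves the sensor. All sizes are assumptions.
import numpy as np

def foveal_readout(frame: np.ndarray, roi: tuple, decimate: int = 8):
    """Return (full-resolution ROI crop, decimated periphery) for one frame.

    roi = (row0, col0, height, width) in sensor pixel coordinates.
    """
    r0, c0, h, w = roi
    fovea = frame[r0:r0 + h, c0:c0 + w]        # full-resolution window
    periphery = frame[::decimate, ::decimate]  # coarse context image
    return fovea, periphery

# Example: a 4096 x 4096 sensor with a 256 x 256 fovea.
frame = np.random.randint(0, 4096, size=(4096, 4096), dtype=np.uint16)
fovea, periphery = foveal_readout(frame, roi=(1024, 2048, 256, 256))
full = frame.size
reduced = fovea.size + periphery.size
print(f"pixels transferred: {reduced} of {full} ({100 * reduced / full:.1f}%)")

Under these assumed numbers, transferring the full-resolution fovea plus a coarse context image amounts to roughly two percent of the full frame.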

The figure depicts a mesh-connected SyFT architecture as applied to a focal-plane mosaic of homogeneous or heterogeneous image sensors. The architecture provides a networked array of reprogrammable controllers for autonomous low-level control with on-the-fly processing of image data from individual image sensors. Each image sensor in the mosaic focal plane is mapped to one of the controllers so that, taken together, the reprogrammable controllers constitute a conceptual (though not necessarily geometric) image-processing plane corresponding to the mosaic focal plane. The controllers can be made versatile enough to control, and to process pixel data from, both charge-coupled-device (CCD) and complementary metal-oxide-semiconductor (CMOS) image sensors in the mosaic focal plane. The image sensors can also have multiple pixel-data outputs, each with dedicated processing circuitry in its associated controller, to achieve high throughput with real-time feature detection and processing.
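To make the sensor-to-controller mapping concrete, the following toy Python model (an assumption for illustration, not the JPL design) pairs each focal-plane sensor with a controller in a two-dimensional mesh that mirrors the mosaic layout; all class and field names are hypothetical:

# Toy model of the mesh-connected architecture: each "smart" imager cell
# pairs one focal-plane sensor with a reprogrammable controller, and the
# cells form a 2-D mesh mirroring the mosaic layout.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ImagerCell:
    row: int
    col: int
    sensor_type: str                 # e.g. "CCD" or "CMOS"
    roi: tuple | None = None         # (row0, col0, h, w), or None if no fovea here
    neighbors: list = field(default_factory=list)  # mesh-adjacent cells

def build_mosaic(rows: int, cols: int, sensor_type: str = "CMOS"):
    """Create a rows x cols mosaic of imager cells with 4-neighbor mesh links."""
    cells = [[ImagerCell(r, c, sensor_type) for c in range(cols)]
             for r in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                if 0 <= r + dr < rows and 0 <= c + dc < cols:
                    cells[r][c].neighbors.append(cells[r + dr][c + dc])
    return cells

mosaic = build_mosaic(4, 4)
print(len(mosaic[1][2].neighbors))   # interior cell -> 4 mesh neighbors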

Each controller includes a routing processor that implements the network protocol and defines the network topology for real-time transfer of raw pixel data and processed results between controllers. The network protocol, and the capability to implement it, are essential to realizing synthetic foveal imaging across the entire mosaic focal plane. The processing and networking capabilities of the controllers enable real-time access to data from multiple image sensors, with application-level control of one or more ROIs within the mosaic focal plane and sharing of detected data features among controllers. These capabilities effectively allow the mosaic to be rewired and reconfigured with different sensors, and to be scaled to different mosaic sizes as dictated by application requirements. Consequently, the mosaic focal plane is treated as an integrated ensemble of synthetic foveal regions that can traverse the entire mosaic for autonomous, intelligent feature detection and tracking. Unlike current state-of-the-art image sensors, “SyFTing” enables intelligent sifting through vast amounts of image data by treating a mosaic focal plane of sensors as an integrated ensemble rather than a collection of isolated sensors.
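The following Python sketch illustrates one way such feature sharing could let a synthetic fovea traverse the mosaic: when a tracked feature drifts past the edge of one sensor's field of view, its controller hands the ROI descriptor to the adjacent controller. The message format, edge test, and sensor dimensions are assumptions for illustration only:

# Hedged sketch of ROI hand-off between neighboring controllers so a
# synthetic fovea can follow a feature across sensor boundaries.
from dataclasses import dataclass

@dataclass
class RoiMessage:
    row0: int
    col0: int
    height: int
    width: int
    track_id: int   # identifier shared across controllers for one tracked feature

# Controllers indexed by mosaic position; each holds its current ROI, if any.
controllers = {(r, c): {"roi": None} for r in range(4) for c in range(4)}

def hand_off(pos, msg, sensor_px=1024, margin=32):
    """Keep the ROI locally, or forward it to the right-hand neighbor when the
    tracked feature drifts past the sensor's right edge."""
    r, c = pos
    if msg.col0 + msg.width >= sensor_px - margin and (r, c + 1) in controllers:
        # Re-anchor the ROI at the left edge of the neighboring sensor.
        controllers[(r, c + 1)]["roi"] = (msg.row0, 0, msg.height, msg.width)
        return (r, c + 1)
    controllers[pos]["roi"] = (msg.row0, msg.col0, msg.height, msg.width)
    return pos

print(hand_off((1, 2), RoiMessage(400, 1000, 64, 64, track_id=7)))   # -> (1, 3)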

This work was done by Michael Hoenk, Steve Monacos, and Shouleh Nikzad of Caltech for NASA’s Jet Propulsion Laboratory. For more information, download the Technical Support Package (free white paper) at www.techbriefs.com/tsp under the Electronics/Computers category.

In accordance with Public Law 96-517, the contractor has elected to retain title to this invention. Inquiries concerning rights for its commercial use should be addressed to:

Innovative Technology Assets Management
JPL
Mail Stop 202-233
4800 Oak Grove Drive
Pasadena, CA 91109-8099
(818) 354-2240

Refer to NPO-44209, volume and number of this NASA Tech Briefs issue, and the page number.


