Two image-data-processing algorithms are essential to the successful operation of a system of electronic hardware and software that noninvasively tracks the direction of a person’s gaze in real time. The system was described in “High-Speed Noninvasive Eye-Tracking System” (NPO-30700), NASA Tech Briefs, Vol. 31, No. 8 (August 2007), page 51.

Figure: The vector between the centroids of the pupil and corneal reflections is computed and then used to compute the direction of gaze and the gaze point.
To recapitulate from the cited article: Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Most of the prior commercial noninvasive eye-tracking systems rely on standard video cameras, which operate at frame rates of about 30 Hz. Such systems are limited to slow, full-frame operation.

The video camera in the present system includes a charge-coupled-device (CCD) image detector plus electronic circuitry capable of implementing an advanced control scheme that effects readout from a small region of interest (ROI), or subwindow, of the full image. Inasmuch as the image features of interest (the cornea and pupil) typically occupy a small part of the camera frame, this ROI capability can be exploited to determine the direction of gaze at a high frame rate by repeatedly reading out the ROI that contains the cornea and pupil while skipping the rest of the image.
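The ROI readout can be steered from frame to frame so that the subwindow follows the eye features. The following Python sketch illustrates one way this could be done; the window size, frame size, and function names are illustrative assumptions, not details of the actual camera control scheme.

```python
import numpy as np

def next_roi(prev_centroid, roi_size=(64, 64), frame_size=(480, 640)):
    """Center a fixed-size ROI on the previous centroid, clamped to the frame."""
    cy, cx = prev_centroid
    h, w = roi_size
    H, W = frame_size
    top = int(np.clip(cy - h // 2, 0, H - h))
    left = int(np.clip(cx - w // 2, 0, W - w))
    return top, left, h, w

# Example: if the pupil was found near row 200, column 310 in the last frame,
# only a 64 x 64 subwindow around that point is read out on the next frame.
print(next_roi((200.0, 310.0)))   # -> (168, 278, 64, 64)
```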

One of the present algorithms exploits the ROI capability. The algorithm operates on horizontal row slices, taking advantage of the symmetry of the pupil and cornea circles and of the grayscale contrast between the pupil and cornea and other parts of the eye. It determines which horizontal image slices contain the pupil and cornea and, on each valid slice, the end coordinates of the pupil and cornea. Information from multiple slices is then combined to robustly locate the centroids of the pupil and cornea images.
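A minimal sketch of this slice-based idea, for the pupil only, is shown below; the threshold, minimum run length, and synthetic test image are illustrative assumptions rather than parameters of the actual algorithm.

```python
import numpy as np

def pupil_centroid_from_slices(image, dark_threshold=40, min_run=5):
    """Estimate the pupil centroid from horizontal row slices of a grayscale image."""
    rows, lefts, rights = [], [], []
    for y, row in enumerate(image):
        dark = np.flatnonzero(row < dark_threshold)  # pixels dark enough to be pupil
        if dark.size >= min_run:                     # this slice contains the pupil
            lefts.append(dark[0])                    # end coordinates of the dark run
            rights.append(dark[-1])
            rows.append(y)
    if not rows:
        return None
    # Exploit the left/right symmetry of the pupil circle: centroid x is the mean
    # of the slice midpoints, and centroid y is the mean of the valid slice rows.
    mids = (np.asarray(lefts) + np.asarray(rights)) / 2.0
    return float(np.mean(rows)), float(np.mean(mids))

# Synthetic test: a dark disk of radius 10 centered at row 32, column 40.
yy, xx = np.mgrid[0:64, 0:80]
frame = np.where((yy - 32) ** 2 + (xx - 40) ** 2 < 100, 10, 200).astype(np.uint8)
print(pupil_centroid_from_slices(frame))  # approximately (32.0, 40.0)
```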

The other of the two present algorithms is a modified version of an older algorithm for estimating the direction of gaze from the centroids of the pupil and cornea. The modification lies in the use of the coordinates of the centroids, rather than the differences between those coordinates, in a gaze-mapping equation. The equation locates a gaze point, defined as the intersection of the gaze axis with a surface of interest, typically a computer display screen (see figure). The expected advantage of the modification is to make the gaze computation less dependent on simplifying assumptions that are not always accurate.
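The article does not give the gaze-mapping equation itself. The sketch below shows one common form such a mapping could take: a low-order polynomial in the pupil and cornea centroid coordinates, with coefficients fitted from calibration fixations at known screen points. The polynomial terms, calibration procedure, and all names are assumptions for illustration, not the authors' equation.

```python
import numpy as np

def design_matrix(pupil, cornea):
    """Feature vector built from pupil and cornea centroid coordinates."""
    px, py = pupil[:, 0], pupil[:, 1]
    cx, cy = cornea[:, 0], cornea[:, 1]
    return np.column_stack([np.ones_like(px), px, py, cx, cy, px * py, cx * cy])

def fit_gaze_map(pupil, cornea, screen_points):
    """Least-squares fit of the mapping coefficients from calibration data."""
    A = design_matrix(pupil, cornea)
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (7, 2): one column of coefficients per screen axis

def gaze_point(pupil, cornea, coeffs):
    """Map centroid measurements to a gaze point on the display surface."""
    return design_matrix(pupil, cornea) @ coeffs

# Usage with made-up calibration data (9 fixation targets):
rng = np.random.default_rng(0)
pupil = rng.uniform(100, 200, size=(9, 2))        # pupil centroids (pixels)
cornea = pupil + rng.uniform(-5, 5, size=(9, 2))  # corneal-reflection centroids
targets = rng.uniform(0, 1024, size=(9, 2))       # known screen fixation points
C = fit_gaze_map(pupil, cornea, targets)
print(gaze_point(pupil[:1], cornea[:1], C))       # gaze estimate for first sample
```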

This work was done by Ashit Talukder, John-Michael Morookian, and James Lambert of Caltech for NASA’s Jet Propulsion Laboratory. For more information, download the Technical Support Package (free white paper) at www.techbriefs.com/tsp under the Information Sciences category.

In accordance with Public Law 96-517, the contractor has elected to retain title to this invention. Inquiries concerning rights for its commercial use should be addressed to:

Innovative Technology Assets Management
JPL
Mail Stop 202-233
4800 Oak Grove Drive
Pasadena, CA 91109-8099

Refer to NPO-30699, volume and number of this NASA Tech Briefs issue, and the page number.



This Brief includes a Technical Support Package (TSP), “Algorithms for High-Speed Noninvasive Eye-Tracking System” (reference NPO-30699), available for download from the TSP library.