Minimizing Input-to-Output Latency in Virtual Environment
- Wednesday, 02 September 2009
A method and apparatus were developed to minimize latency (time delay) in virtual-environment (VE) and other discrete-time computer-based systems that must update a display in real time in response to sensor inputs. Latency in such systems is the sum of the finite times required for information processing and communication within and between sensors, software, and displays. Even though the latency intrinsic to each individual hardware, software, and communication component can be minimized (or theoretically eliminated) by increasing internal computation and transmission speeds, time delays due to the integration of the overall system persist. These “integration” delays arise when data produced or processed by earlier components or stages in a system pathway sit idle, waiting to be accessed by subsequent components. Such idle times can be sizable compared with the latencies of the individual components, and they can also vary in duration because of insufficient synchrony between events in the data path. This development is intended specifically to reduce the magnitude and variability of these idle-time delays and thus to minimize and stabilize the overall latency of the complete VE (or other computer) system.
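The idle-time effect described above can be illustrated with a toy timing model (this is only an illustrative sketch, not NASA's patented method; the sensor/display rates and the 1 ms trigger margin are assumptions). It compares the average staleness of sensor data at display time when the sensor free-runs against the case where each sample is triggered just before the display refresh:

```python
import statistics

def data_age_at_display(sample_period, display_period, sample_offset, n_frames=1000):
    """Average age (staleness) of the newest sensor sample at each display refresh.

    The sensor produces samples at t = sample_offset + k * sample_period;
    the display reads the latest available sample at t = j * display_period.
    The age of that sample is the idle time the data spent waiting to be used.
    """
    ages = []
    for j in range(1, n_frames + 1):
        t_display = j * display_period
        # Index of the most recent sample at or before the display read.
        k = int((t_display - sample_offset) // sample_period)
        t_sample = sample_offset + k * sample_period
        ages.append(t_display - t_sample)
    return statistics.mean(ages)

# Free-running: a 100-Hz sensor and a 60-Hz display with no synchronization,
# so samples sit idle for a variable fraction of the sample period.
unsync = data_age_at_display(sample_period=0.01, display_period=1 / 60,
                             sample_offset=0.0)

# Synchronized: one sample is triggered 1 ms before each 60-Hz refresh,
# so the idle time collapses to a small, nearly constant margin.
jit = data_age_at_display(sample_period=1 / 60, display_period=1 / 60,
                          sample_offset=1 / 60 - 0.001)

print(f"free-running mean data age: {unsync * 1000:.2f} ms")
print(f"synchronized mean data age: {jit * 1000:.2f} ms")
```

Even though both pipelines use the same components, the synchronized schedule shows a smaller and steadier data age, which is the kind of integration-delay reduction the invention targets.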
This work was done by Bernard D. Adelstein and Stephen R. Ellis of Ames Research Center and Michael I. Hill of San Jose State University Foundation.
This invention is owned by NASA and a patent application has been filed. Inquiries concerning rights for the commercial use of this invention should be addressed to the Ames Technology Partnerships Division at (650) 604-5761. Refer to ARC-15102-1.
This Brief includes a Technical Support Package (TSP).
Minimizing Input-to-Output Latency in Virtual Environment (reference ARC-15102-1) is currently available for download from the TSP library.