
The predictor used here is computed directly from a measured open-loop disturbance sequence using an efficient subspace identification algorithm.

Current science objectives, such as high-contrast imaging of exoplanets, have led to the development of high-order adaptive optics (AO) systems possessing several thousand deformable mirror (DM) actuators. These systems typically rely on integrator-based control architectures, in which the temporal error-rejection bandwidth is limited by the computational latency between wavefront measurement and application of the DM commands. In many systems, this latency is the driving factor behind residual wavefront error.

Wavefront regulation in large AO systems involves the simultaneous control of several thousand channels, and hence requires a large amount of real-time computation. In certain systems, this computational burden is distributed across multiple graphics processing units (GPUs), which can parallelize the large vector-matrix multiplication (VMM) operations needed for control. Current GPU-based AO systems are generally restricted to scalar integral control applied to each channel; however, this architecture is poorly suited to mitigating most forms of dynamic turbulence. The standard integrator approach also lacks any prediction capability, and hence cannot compensate for the inherent loop latency between wavefront measurement and compensation.
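As a rough illustration of why loop latency degrades an integrator (not code from this software), the following sketch simulates a single control channel with the update law c[k+1] = c[k] + g·r[k−d], where r is the residual error and d is the measurement-to-command delay in frames. The gain, delay, and sinusoidal disturbance are all assumptions chosen for the sketch:

```python
import numpy as np

def run_integrator(disturbance, gain, delay):
    """Closed-loop scalar integrator: c[k+1] = c[k] + gain * r[k - delay],
    where r[k] = disturbance[k] - c[k] is the residual wavefront error."""
    n = len(disturbance)
    c = np.zeros(n)  # commands
    r = np.zeros(n)  # residuals
    for k in range(n - 1):
        r[k] = disturbance[k] - c[k]
        if k >= delay:
            c[k + 1] = c[k] + gain * r[k - delay]
        else:
            c[k + 1] = c[k]  # no measurement available yet
    r[-1] = disturbance[-1] - c[-1]
    return r

# Hypothetical single turbulence mode at 0.05 cycles/frame
k = np.arange(2000)
phi = np.sin(2 * np.pi * 0.05 * k)

# RMS residual after the loop settles, with and without a 2-frame delay
rms_prompt = np.sqrt(np.mean(run_integrator(phi, 0.5, 0)[200:] ** 2))
rms_delayed = np.sqrt(np.mean(run_integrator(phi, 0.5, 2)[200:] ** 2))
print(rms_prompt, rms_delayed)  # the delayed loop leaves a larger residual
```

Because the integrator acts only on stale measurements, the delayed loop rejects the same disturbance noticeably less well, which is the motivation for the predictive controller described below.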

For an AO system consisting of a closed integrator loop and an integer number of frame delays, the linear time-invariant (LTI) controller that minimizes the mean-squared residual wavefront error is a multichannel Kalman predictor for the incident turbulence sequence. The predictor used here is not generated using standard approaches for Kalman filter design, which require a priori knowledge of the disturbance statistics. Instead, it is computed directly from a measured open-loop disturbance sequence using a computationally efficient subspace identification algorithm.
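To make the data-driven idea concrete, the sketch below fits a one-step-ahead linear predictor to a measured open-loop disturbance sequence by ordinary least squares. This is a deliberately simplified stand-in for the subspace identification algorithm the article describes (which produces a full multichannel Kalman predictor); the synthetic AR(2) disturbance and the filter order are assumptions of the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical open-loop disturbance: a damped AR(2) turbulence-like sequence
n, order = 5000, 4
d = np.zeros(n)
for k in range(2, n):
    d[k] = 1.6 * d[k - 1] - 0.7 * d[k - 2] + 0.1 * rng.standard_normal()

# One-step-ahead linear predictor fit by least squares: predict d[k] from
# the previous `order` samples (simplified stand-in for subspace ID)
X = np.column_stack([d[order - 1 - i: n - 1 - i] for i in range(order)])
y = d[order:]
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Compare against the no-prediction baseline ("assume the last measurement")
pred_err = np.sqrt(np.mean((X @ coeffs - y) ** 2))
hold_err = np.sqrt(np.mean((d[order - 1:-1] - y) ** 2))
print(pred_err, hold_err)  # the identified predictor beats the hold
```

The key point carried over from the article is that nothing about the disturbance statistics is assumed in advance: the predictor coefficients come entirely from the recorded open-loop sequence.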

In addition to its architecture, this software utilizes a novel VMM technique to improve computational efficiency. To improve the spatial locality, and hence the performance, of vector-matrix multiplications (y = Ax), each matrix A of dimension M × N is partitioned into M/m horizontal stripes stored in column-major order; the stripes are processed in parallel, each by a thread block of m × n threads, such that each thread processes N/n matrix elements.
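The partitioning scheme above can be sketched on the CPU as follows. This is an emulation of the decomposition only, not the GPU kernel itself: each "thread block" and "thread" is a loop, each thread handles a contiguous run of N/n columns of its row (the exact column interleaving within a stripe is an assumption of this sketch), and the n per-thread partial sums are reduced into one output element:

```python
import numpy as np

def striped_vmm(A, x, m, n):
    """Emulate the striped VMM partitioning: A (M x N) is split into M/m
    horizontal stripes; each stripe maps to a notional m x n thread block,
    and each 'thread' (r, t) accumulates N/n elements of row r into a
    partial sum that is then reduced to produce y[row]."""
    M, N = A.shape
    assert M % m == 0 and N % n == 0
    y = np.zeros(M)
    for s in range(M // m):                 # one stripe per thread block
        stripe = A[s * m:(s + 1) * m]       # m x N horizontal stripe
        for r in range(m):                  # thread rows within the block
            partial = np.zeros(n)
            for t in range(n):              # thread columns within the block
                cols = slice(t * (N // n), (t + 1) * (N // n))
                partial[t] = stripe[r, cols] @ x[cols]
            y[s * m + r] = partial.sum()    # reduce across the n threads
    return y

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 12))
x = rng.standard_normal(12)
print(np.allclose(striped_vmm(A, x, m=4, n=3), A @ x))  # matches y = Ax
```

On a GPU, this decomposition lets the m × n threads of a block read a stripe with good spatial locality while keeping every multiprocessor busy; the sketch only verifies that the partitioned computation reproduces the full product.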

A key advantage of the identification approach demonstrated here is that a prediction filter can be identified quickly across a variety of turbulence conditions and telescope orientations. Hence, the near-optimal control performance observed here should be maintained even when mitigating strong, high-bandwidth disturbances.

This work was done by Jonathan A. Tesch, Tuan N. Truong, and Rick S. Burruss of Caltech, and Steve Gibson of UCLA, for NASA's Jet Propulsion Laboratory. This software is available for license through the Jet Propulsion Laboratory. NPO-49591
