A computer program introduces a predistortion into the chirp signal used to frequency-modulate a synthetic-aperture-radar (SAR) signal at the transmitter. The predistortion is intended to compensate for distortions introduced by nonideal performance of the various pieces of SAR equipment — especially the transmitter. More specifically, the predistortion is intended to make the signal in the receiver an ideal SAR signal in that it exhibits optimum correlation performance.

The program is run as a calibration routine. It is used in conjunction with a closed calibration loop that includes the SAR transmitter and receiver, plus an arbitrary-waveform generator (AWG), which generates the predistorted chirp under computer control. The program causes the AWG to adjust the predistorted chirp iteratively until the chirp in the receiver differs from an optimum or reference chirp by less than a prescribed small measure of error. In a test in which a bad elliptical filter was used to simulate (with respect to distortion) the transmitting and receiving equipment of an SAR system, the program succeeded in generating a simulated receiver chirp within less than 1 dB of optimum.
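The closed calibration loop described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual JPL program: the distorting hardware is stood in for by a fixed frequency response (the brief's test used a bad elliptical filter in this role), and all function names here are illustrative assumptions.

```python
import numpy as np

def linear_chirp(n, f0, f1, fs):
    """Linear-FM reference chirp: n samples at sample rate fs, sweeping f0 to f1."""
    t = np.arange(n) / fs
    k = (f1 - f0) * fs / n             # sweep rate in Hz per second
    return np.cos(2 * np.pi * (f0 + 0.5 * k * t) * t)

fs, n = 1000.0, 1024
ref = linear_chirp(n, 50.0, 200.0, fs)

# Stand-in for the distorting transmit/receive chain: a fixed frequency
# response applied by circular convolution, so the math is exactly invertible.
H = np.fft.fft(np.array([0.7, 0.2, 0.1]), n)

def channel(x):
    """Simulated hardware: apply the distorting frequency response."""
    return np.real(np.fft.ifft(np.fft.fft(x) * H))

def calibrate(reference, n_iter=3, eps=1e-12):
    """Iteratively adjust the predistorted chirp's spectrum until the chirp
    coming out of the channel matches the reference chirp."""
    R = np.fft.fft(reference)
    P = R.copy()                       # spectrum of the predistorted chirp
    for _ in range(n_iter):
        Y = np.fft.fft(channel(np.real(np.fft.ifft(P))))
        P = P * R / (Y + eps)          # push the received spectrum toward R
    return np.real(np.fft.ifft(P))

pre = calibrate(ref)                   # predistorted chirp to load into the AWG
rx = channel(pre)                      # what the receiver now sees
```

In this sketch one frequency-domain division essentially inverts the linear distortion; the real loop iterates because the hardware's distortion is measured, not known in closed form.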
This program was written by Andrew Berkun of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.nasatech.com/tsp under the Software category.
This software is available for commercial licensing. Please contact Don Hart of the California Institute of Technology at (818) 393-3425. Refer to NPO-20420.
This Brief includes a Technical Support Package (TSP).

Software for Obtaining Ideal SAR Chirps (reference NPO-20420) is currently available for download from the TSP library.
Overview
The document is a technical support package from NASA's Jet Propulsion Laboratory (JPL) detailing a technology developed by Andrew Berkun to improve synthetic-aperture-radar (SAR) signal processing. The primary focus is a computer program that improves radar signal quality by compensating for distortions introduced by the radar system's components, particularly the arbitrary-waveform generator (AWG) and the analog-to-digital converters (ADCs).
The program employs a predistortion technique to modify the chirp signal before transmission, ensuring that the received signal closely resembles an ideal SAR signal. This is crucial for high-quality radar imaging, as distortions can significantly degrade system performance. The document also outlines the practical challenges involved, including the difference in sample rates between the AWG and the ADCs: because the two devices' frequency grids do not share common FFT bins, a discrete Fourier transform evaluated at arbitrary frequencies is used instead of a fast Fourier transform (FFT). This approach, while more memory-intensive, helps mitigate noise accumulation and band-edge effects.
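A direct DFT evaluated off the FFT's bin grid, as the paragraph above describes, can be written compactly. This is a generic sketch under assumed sample rates, not code from the package:

```python
import numpy as np

def dft_at(x, fs, freqs):
    """Direct DFT of samples x (sample rate fs) evaluated at an arbitrary
    array of frequencies in Hz. Unlike the FFT, the frequencies need not
    lie on x's own bin grid, at the cost of O(N*M) time and memory."""
    n = np.arange(len(x))
    E = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)   # M x N exponentials
    return E @ x

# Hypothetical rates: ADC data at 48 kS/s, analyzed on a 100 Hz grid that
# matches a different-rate device rather than the ADC's own FFT bins.
fs_adc = 48000.0
x = np.sin(2 * np.pi * 1000.0 * np.arange(480) / fs_adc)
X = dft_at(x, fs_adc, 100.0 * np.arange(10))
```

When the requested frequencies happen to be the signal's own FFT bins, `dft_at` reproduces `np.fft.fft` exactly, which makes the function easy to sanity-check.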
Key performance metrics discussed include the Integrated Side Lobe Ratio (ISLR), which, as used here, measures the energy of the compressed signal's main lobe relative to the energy in its side lobes; a higher ISLR therefore indicates better signal quality, characterized by one large main peak and minimal secondary peaks. The document also explains how an error transform is computed to correct for distortions, by manipulating the reference function and the distorted ADC data.
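The ISLR in the convention stated above (main-lobe energy over side-lobe energy, so larger is better) can be computed as follows. This is an illustrative sketch, and the main-lobe width parameter is an assumption, not a value from the document:

```python
import numpy as np

def islr_db(compressed, main_halfwidth):
    """ISLR as main-lobe energy divided by total side-lobe energy, in dB.
    main_halfwidth sets how many samples on each side of the peak are
    counted as part of the main lobe."""
    power = np.abs(compressed) ** 2
    p = int(np.argmax(power))
    lo = max(0, p - main_halfwidth)
    hi = min(len(power), p + main_halfwidth + 1)
    main = power[lo:hi].sum()
    side = power.sum() - main
    return 10.0 * np.log10(main / side)

# Compressed pulse: matched-filter output (autocorrelation) of a chirp
fs, n = 1000.0, 512
t = np.arange(n) / fs
chirp = np.cos(2 * np.pi * (50.0 + 0.5 * (150.0 * fs / n) * t) * t)
compressed = np.correlate(chirp, chirp, mode="full")
islr = islr_db(compressed, main_halfwidth=5)
```

A signal that is one sharp peak with tiny secondary peaks scores high; spreading energy into side lobes drives the value down.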
The program's effectiveness is highlighted by its ability to produce a simulated receiver chirp within 1 dB of optimum, showing its potential for real-world radar applications. The document also emphasizes the importance of the calibration routine and the need to recompute correction transforms as the system changes over time and temperature.
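A stored correction transform of the kind mentioned above can be sketched as a ratio of spectra: measure the system once against the reference, keep the ratio, and apply it to later acquisitions until drift forces a recalibration. All names and the simulated distortion here are illustrative assumptions, not details from the package:

```python
import numpy as np

def correction_transform(reference, measured, eps=1e-12):
    """Error transform: ratio of the reference spectrum to the distorted
    (measured) spectrum. Must be recomputed when the system drifts with
    time or temperature."""
    return np.fft.fft(reference) / (np.fft.fft(measured) + eps)

def apply_correction(data, C):
    """Apply a stored correction transform to newly acquired ADC data."""
    return np.real(np.fft.ifft(np.fft.fft(data) * C))

# Demo: a fixed linear distortion standing in for the hardware chain
n = 256
ref = np.cos(2 * np.pi * 20.0 * np.arange(n) / n)
H = np.fft.fft(np.array([0.8, 0.15, 0.05]), n)
measured = np.real(np.fft.ifft(np.fft.fft(ref) * H))

C = correction_transform(ref, measured)   # computed once, at calibration time
corrected = apply_correction(measured, C) # applied to each later acquisition
```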
Overall, this technical support package presents a significant advancement in radar signal processing, with implications for various applications in aerospace and remote sensing. For commercial licensing inquiries, the document directs interested parties to contact Don Hart at Caltech.

