In the coming years, the wireless communications segment will expand significantly as 5G private networks launch. Private networks will take advantage of higher-frequency, wider-bandwidth applications in frequency range 1 (FR1) and frequency range 2 (FR2), which extends into the millimeter-wave (mmWave) band. Testing at higher frequencies is complicated by the range of frequencies, bandwidths, and deployment modes that devices and networks support.
With 5G new radio (NR), private base stations (BS) can support connectivity in different spectrum bands (sub-GHz, 1 to 6 GHz, or mmWave). 5G base station products must pass all required tests prior to release; otherwise, they are not 3GPP compliant and are unsuitable for deployment in a network. To ensure that base station transmitters in the network perform to standard, it is critical to understand the testing challenges thoroughly and to select test equipment that delivers robust performance.
At higher frequencies such as mmWave and wider bandwidths, signal quality is more susceptible to impairments. To characterize and test the true performance of 5G products, designers and manufacturers need to address testing challenges in these situations. Error vector magnitude (EVM) measurements can offer powerful insight into the performance of a digital communication base station transmitter. In fact, EVM measurements can detect any flaw affecting the magnitude and phase trajectory of a signal, regardless of the modulation format.
Using EVM to Characterize Private Network Signal Quality
The 3GPP standard defines the radio frequency (RF) conformance test methods and requirements for NR base stations in the technical specifications TS 38.141-1 (conducted testing) and TS 38.141-2 (radiated testing). Clause 6 of each specification lays out the requirements for base station transmitter tests.
For 3GPP standards, EVM measurement is one of the primary metrics to assess the quality of the transmitted signal from the base station. The error vector is the vector difference between the in-phase and quadrature (IQ) reference signal and the measured signal at a given time. EVM measurement is sensitive to any signal impairment affecting the magnitude and phase trajectory of a signal for any digital modulation format. Therefore, EVM is an ideal tool for diagnosing problems in the baseband, the intermediate frequency (IF), or the RF sections of a communication system.
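The error-vector definition above translates directly into a short calculation. The following is a minimal Python sketch, not instrument firmware; the QPSK symbols and the noise level are invented purely for illustration:

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS EVM as a percentage of the reference RMS power.

    measured, reference: complex IQ symbol arrays of equal length.
    The error vector is the complex difference at each symbol instant.
    """
    error = measured - reference
    return 100.0 * np.sqrt(np.mean(np.abs(error) ** 2)
                           / np.mean(np.abs(reference) ** 2))

# Ideal QPSK symbols plus additive noise (illustrative values only).
rng = np.random.default_rng(0)
ref = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
meas = ref + 0.02 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(f"EVM = {evm_percent(meas, ref):.2f}%")
```

Because the metric compares full complex trajectories rather than decoded bits, the same function applies unchanged to any modulation format.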
Many factors influence the quality of a signal, including baseband signal processing, modulation, filtering, and up-conversion. Orthogonal properties in orthogonal frequency division multiplexing (OFDM) systems prevent interference between overlapping subcarriers. However, impairments such as IQ errors, phase noise, amplitude compression (AM-to-AM), and amplitude-to-phase conversion (AM-to-PM), as well as frequency errors, can distort the modulated signal.
To avoid the impact of signal degradation on design performance, device designers need to overcome the physical challenges present in wide-bandwidth, high-frequency signals. Measuring and characterizing signal quality requires test solutions with better performance than the device under test (DUT) that do not introduce new signal impairments. Figure 1 shows examples of IQ modulation impairments that cause constellation symbols to deviate from their ideal locations. The type and intensity of the deviations reveal the nature of the impairments.
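The constellation deviations just described can be reproduced with a simple behavioral model. The impairment values below (gain imbalance, quadrature skew, DC offset) are arbitrary illustrative choices, not figures taken from the article:

```python
import numpy as np

def apply_iq_impairments(symbols, gain_imbalance_db=0.5, skew_deg=2.0,
                         dc_offset=0.01 + 0.01j):
    """Apply a toy model of IQ modulator impairments to ideal symbols."""
    g = 10 ** (gain_imbalance_db / 20)         # I-branch gain imbalance
    phi = np.deg2rad(skew_deg)                 # quadrature (phase) skew
    i, q = symbols.real, symbols.imag
    i_out = g * i
    q_out = q * np.cos(phi) + i * np.sin(phi)  # skew leaks I into Q
    return i_out + 1j * q_out + dc_offset      # DC offset shifts the origin

ideal = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
impaired = apply_iq_impairments(ideal)
for a, b in zip(ideal, impaired):
    print(f"{a:+.3f} -> {b:+.3f}")
```

Plotting `impaired` against `ideal` reproduces the kind of constellation distortion shown in Figure 1: gain imbalance stretches one axis, skew shears the grid, and DC offset translates it.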
EVM measurements during conformance testing verify that modulation quality is within the limits specified by the minimum requirements. Antenna connectors and transceiver array boundary (TAB) connectors must meet the minimum requirements whether tested one at a time or in parallel, and the tests must be repeated for all antenna and TAB connectors. For example, the EVM of NR carriers using different modulation schemes on the physical downlink shared channel (PDSCH) must remain below the limits in Table 1. EVM measurements must cover all bandwidths, all allocated resource blocks, and all downlink slots within 10 ms measurement periods.
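Table 1 is not reproduced here, but the widely cited PDSCH EVM limits from 3GPP TS 38.104 (17.5 percent for QPSK, 12.5 percent for 16QAM, 8 percent for 64QAM, 3.5 percent for 256QAM) give a sense of the pass/fail check; always confirm against the specification release in use:

```python
# PDSCH EVM limits per 3GPP TS 38.104 (verify against the current release).
EVM_LIMITS_PCT = {"QPSK": 17.5, "16QAM": 12.5, "64QAM": 8.0, "256QAM": 3.5}

def passes_evm(modulation: str, measured_evm_pct: float) -> bool:
    """True if the measured EVM is within the limit for the scheme."""
    return measured_evm_pct <= EVM_LIMITS_PCT[modulation]

print(passes_evm("64QAM", 5.2))   # True: within the 8% limit
print(passes_evm("256QAM", 5.2))  # False: exceeds the 3.5% limit
```

The same measured EVM can pass for one modulation scheme and fail for a denser one, which is why the conformance tests sweep every supported scheme.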
Challenges for Testing 5G Private Network Base Station Transmitters
Confronting Path Loss
Over-the-air (OTA) tests can be performed in either the near field or the far field. While near-field measurements may be appropriate for some applications, 5G cellular communication requires far-field assumptions. Both the far-field distance and the path loss of radiated waves increase with frequency. Figure 2 shows how the characteristics of these signals change as they propagate from antenna arrays and develop in the far-field region.
The greater the path loss and the larger the distance between the DUT and the probe antenna, the more challenging the test. At higher frequencies, the excess path loss between the instrument and the DUT lowers the signal-to-noise ratio (SNR), making accurate OTA signal analysis difficult. Low SNR degrades base station transmitter measurements such as EVM and misrepresents the true performance of the DUT. Therefore, 5G private networks require engineers to rethink their design and testing processes.
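The frequency dependence described above can be quantified with the Fraunhofer far-field distance (2D²/λ) and the free-space path loss formula. The 15 cm array aperture and the two carrier frequencies below are hypothetical examples, not values from the article:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def far_field_distance_m(aperture_m: float, freq_hz: float) -> float:
    """Fraunhofer far-field distance: 2 * D^2 / wavelength."""
    return 2 * aperture_m ** 2 / (C / freq_hz)

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

for f_ghz in (3.5, 28.0):  # an illustrative FR1 and FR2 carrier
    f = f_ghz * 1e9
    d = far_field_distance_m(0.15, f)  # hypothetical 15 cm aperture
    print(f"{f_ghz} GHz: far field starts ~{d:.2f} m, "
          f"FSPL at that distance ~{fspl_db(d, f):.1f} dB")
```

Running this shows both effects compounding: the mmWave carrier pushes the far-field boundary meters away from the array, and the path loss at that larger distance is tens of dB higher than in FR1.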
Wideband noise, which adds measurement uncertainty and complexity, is another testing challenge for 5G private network base station transmitters at higher frequencies and wider bandwidths. Higher frequencies and wider bandwidths enable faster data rates, lower latency, higher throughput, better resolution and accuracy, and higher-order modulation, but they also come with more noise.
At the receiver, the transmitted signal must compete with the channel's noise floor. A wider bandwidth increases the rate at which data can be transmitted over the channel, but it also admits more noise into the signal analyzer. The upshot is that greater bandwidth reduces the SNR and makes accurate measurements harder to achieve. Consequently, engineers must evaluate these trade-offs carefully when planning their test systems.
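The bandwidth-versus-SNR trade-off follows directly from the thermal noise floor, kTB (about −174 dBm/Hz at 290 K). A quick sketch, assuming a hypothetical 10 dB analyzer noise figure and a −40 dBm signal at the input:

```python
import math

def noise_floor_dbm(bandwidth_hz: float, noise_figure_db: float = 10.0) -> float:
    """Thermal noise floor: kTB at 290 K (~ -174 dBm/Hz) plus analyzer NF."""
    return -174.0 + 10 * math.log10(bandwidth_hz) + noise_figure_db

signal_dbm = -40.0  # hypothetical signal level at the analyzer input
for bw_mhz in (20, 100, 400):  # typical FR1 and FR2 channel bandwidths
    n = noise_floor_dbm(bw_mhz * 1e6)
    print(f"{bw_mhz} MHz: noise floor {n:.1f} dBm, SNR {signal_dbm - n:.1f} dB")
```

Moving from a 20 MHz to a 400 MHz channel raises the integrated noise floor by 10·log10(400/20) ≈ 13 dB, costing 13 dB of SNR before any other impairment is considered.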
A test system’s cables, connectors, switches, and fixtures affect the frequency response of the path between the signal analyzer and the DUT, introducing amplitude and phase errors that degrade measurement accuracy.
When testing 5G NR FR2 signals at wider bandwidths and higher frequencies, these frequency response errors grow. Because the primary objective of a test system is to characterize the DUT, measurement accuracy must be extended from the signal analyzer’s input port (its reference plane) to the DUT’s test port (the test plane). The system should isolate the DUT’s measured results from the effects of the test system itself; minimizing frequency response errors produces more accurate measurements.
Five Tips for Optimizing EVM at Higher Frequencies in Private Networks
Tip 1: A signal analyzer can add attenuation at high power levels or use a preamplifier at low power levels to handle a wide variety of input signals. In general, signal analyzers provide several RF signal paths, such as a default path, a microwave preselector bypass, a low-noise path, and a full-bypass path, to reduce noise and thereby improve sensitivity and SNR. The full-bypass path reduces path loss while preserving signal fidelity and measurement sensitivity, because it avoids multiple switches in the low-band switch circuitry and bypasses the microwave preselectors. At millimeter-wave frequencies, full-bypass paths have up to 10 dB less loss, improving SNR and producing more accurate EVM measurements.
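The benefit of a lower-loss path can be estimated from the noise-limited EVM floor, EVM ≈ 10^(−SNR/20). The SNR values below are hypothetical; the point is that 10 dB less path loss improves the floor by roughly a factor of three:

```python
def evm_floor_pct(snr_db: float) -> float:
    """Noise-limited EVM floor (percent) of an otherwise ideal measurement."""
    return 100.0 * 10 ** (-snr_db / 20)

# Hypothetical example: a full-bypass path with ~10 dB less loss raises
# the signal at the mixer, improving SNR by the same 10 dB.
for snr in (25.0, 35.0):
    print(f"SNR {snr:.0f} dB -> EVM floor {evm_floor_pct(snr):.2f}%")
```

This simple relation explains why path loss dominates mmWave EVM budgets: every dB recovered from the signal path comes straight off the measurement floor.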
Tip 2: A high input mixer level can improve SNR, whereas a low input mixer level can reduce distortion. It is important to find an input mixer level that balances distortion performance against noise sensitivity. In addition, it is critical to keep the path loss between the DUT and the signal analyzer as low as possible when building a test system. Using an external mixer can minimize path loss by moving the test port close to the DUT, thereby shortening the signal path and enhancing SNR.
However, since there is no preselector in front of the mixer, strong out-of-band signals can cause unwanted images to appear in the band of interest, reducing measurement accuracy. Using an external frequency extender provides access to a combination of the preselector and RF switch. The combination provides a swept power spectrum up to 110 GHz (unbanded and preselected) to achieve excellent sensitivity.
Tip 3: The phase noise performance of a signal analyzer also affects EVM measurements. 5G NR transmits data in parallel using OFDM, a modulation scheme that uses many closely spaced orthogonal subcarriers, each carrying its own modulation. If the local oscillator (LO) of a signal analyzer has poor phase noise, the phase noise spreads each subcarrier into its neighbors and lowers modulation quality. Signal analyzers offer several phase noise optimization methods, including best close-in, best wide-offset, and fast tuning. Modulation analysis requires consideration not only of the analyzer's phase noise profile but also of its operating frequency, bandwidth, and subcarrier spacing.
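The link between an LO's phase noise profile and EVM can be sketched by integrating the single-sideband profile L(f) into an RMS phase error. The offset grid and dBc/Hz values below are hypothetical, and the coarse trapezoidal integration is only an order-of-magnitude estimate:

```python
import numpy as np

def phase_noise_evm_pct(offsets_hz, l_dbc_hz):
    """EVM contribution from LO phase noise.

    Trapezoidal integration of a single-sideband phase noise profile
    L(f) in dBc/Hz, doubled to count both sidebands; the RMS phase
    error in radians maps to percent EVM via the small-angle
    approximation EVM ~ sigma_phi.
    """
    l_lin = 10 ** (np.asarray(l_dbc_hz, dtype=float) / 10)
    f = np.asarray(offsets_hz, dtype=float)
    var = 0.0
    for k in range(len(f) - 1):
        var += (l_lin[k] + l_lin[k + 1]) / 2 * (f[k + 1] - f[k])
    return 100.0 * np.sqrt(2 * var)

# Hypothetical analyzer phase noise profile (offset in Hz, L(f) in dBc/Hz).
offsets = [1e3, 1e4, 1e5, 1e6, 1e7]
profile = [-95.0, -100.0, -108.0, -120.0, -135.0]
print(f"phase noise EVM contribution ~{phase_noise_evm_pct(offsets, profile):.2f}%")
```

The integration limits matter: for OFDM demodulation, offsets well inside the subcarrier spacing are largely tracked out, which is why the analyzer's optimization mode should match the signal's subcarrier spacing.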
Tip 4: A low-noise amplifier (LNA) at the front end of the signal analyzer reduces the system noise figure, with or without the internal preamplifier, and helps optimize the mixer's input level. Signal analyzers can feature an integrated LNA and preamplifier for various test scenarios in FR1 and FR2 applications. The two-stage gain optimizes measurements at low input levels, reducing noise and providing the best performance while balancing noise against distortion. Figure 3 shows an example of 5G demodulation in which turning on the LNA (right) improves EVM significantly, from 5.75 percent to 1.99 percent.
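The noise figure improvement from a front-end LNA follows the Friis cascade formula. The gain and noise figure numbers below are hypothetical, not specifications of any particular analyzer:

```python
import math

def cascade_nf_db(stages):
    """Friis cascade noise figure; stages are (gain_dB, nf_dB), input first."""
    f_total, gain = 1.0, 1.0
    for gain_db, nf_db in stages:
        # Each stage's excess noise is divided by the gain ahead of it.
        f_total += (10 ** (nf_db / 10) - 1) / gain
        gain *= 10 ** (gain_db / 10)
    return 10 * math.log10(f_total)

# Hypothetical values: a mixer-first receiver alone vs. with an LNA in front.
analyzer = [(0.0, 25.0)]              # receiver path with ~25 dB NF
with_lna = [(20.0, 5.0)] + analyzer   # 20 dB gain, 5 dB NF LNA added
print(f"analyzer alone: NF {cascade_nf_db(analyzer):.1f} dB")
print(f"with LNA:       NF {cascade_nf_db(with_lna):.1f} dB")
```

With enough front-end gain, the first stage's noise figure dominates the cascade, which is why placing the LNA ahead of the lossy path pays off far more than adding gain later.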
Tip 5: A signal analyzer’s specifications are usually valid only up to the instrument’s input connectors, where the instrument sets its reference plane. It is important to consider the impact of the components outside the test instrument in the path between the instrument and the DUT. Components such as cables, connectors, switches, and fixtures degrade the measurement accuracy of systems by adding their own frequency responses. RF engineers should look for new and better ways to minimize frequency response errors at wide bandwidths and high frequencies such as mmWave and beyond.
Using a reliable receiver calibrator makes it possible to move the reference plane to the DUT and calibrate the system. Calibration data transfers automatically from the calibrator's memory to the signal analyzer, which detects and applies it. This setup reduces the effort and complexity of calibrating the test receiver system up to 110 GHz.
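Conceptually, the calibration data is a complex frequency response of everything between the DUT port and the analyzer input, and dividing it out refers measurements back to the DUT's test plane. A toy sketch with made-up frequency points and response values:

```python
import numpy as np

# Hypothetical complex system response (cables, switches, fixtures)
# characterized by a receiver calibrator at three frequency points.
freqs_ghz = np.array([27.0, 28.0, 29.0])
system_resp = np.array([0.80 - 0.10j, 0.75 + 0.05j, 0.70 - 0.02j])

# Raw values as seen at the analyzer input (also made up).
raw = np.array([0.40 - 0.05j, 0.30 + 0.02j, 0.21 - 0.006j])

# Dividing out the system response moves the reference plane to the DUT,
# removing the test system's amplitude and phase errors from the result.
corrected = raw / system_resp
for f, c in zip(freqs_ghz, corrected):
    print(f"{f:.0f} GHz: |corrected| = {abs(c):.3f}")
```

Real calibration routines interpolate such responses across the full measurement bandwidth, but the principle is the same per-bin complex correction.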
Select the Right Test Solution to Ensure Robust Performance
With the rapid evolution of cellular communication systems, there is a growing need for higher operating frequencies and wider bandwidths to support next-generation wireless standards. Over the next decade, thousands of companies will likely deploy private cellular networks. It is therefore essential to select test equipment that provides robust performance and ensures that the base station transmitters in the network perform to standard. Reliable signal analysis solutions and measurement software applications help keep 5G private network transmitters compliant with ever-evolving standards while maximizing efficiency and accuracy.
This article was written by Paris Akhshi, Ph.D., Product Marketing Manager, Keysight Technologies (Santa Rosa, CA). For more information, visit here.