The Importance of Low Phase Noise and How to Measure It
Phase noise is present in all electronic signals. It is caused by unintentional phase modulation of an RF waveform; no RF signal is perfect. This phase noise component reduces the fidelity of an RF signal, which can have significant effects, such as higher symbol error rates in communications systems or masked returns in radar systems. The same phenomenon viewed in the time domain is called jitter, and like phase noise it degrades the signal's overall fidelity. One of the main ways to optimize system performance is to reduce phase noise as much as possible. Because of these potentially significant negative impacts, it is essential to be able to measure phase noise accurately throughout the testing process.
What is Phase Noise?
To understand phase noise, we first need to define an ideal signal. For RF, an ideal signal would be represented as a sine wave. This ideal sine wave can be represented mathematically as:
V(t) = A₀sin(ω₀t)
Where:
A₀ = nominal amplitude
ω₀ = nominal frequency
This ideal signal can then be plotted both in the time domain and in the frequency domain.
[Figure: the ideal signal plotted in the time domain and the frequency domain]
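For readers who want to recreate plots like these, below is a minimal Python sketch (using NumPy and Matplotlib, with an arbitrarily chosen amplitude, frequency, and sample rate) that synthesizes the ideal tone and displays it in both domains; in the frequency domain the ideal signal appears as a single discrete line.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical parameters chosen only for illustration
A0 = 1.0      # nominal amplitude
f0 = 10e6     # nominal frequency, 10 MHz
fs = 100e6    # sample rate
N = 4096      # number of samples

t = np.arange(N) / fs
v_ideal = A0 * np.sin(2 * np.pi * f0 * t)   # V(t) = A0*sin(w0*t)

# Frequency-domain view: windowed magnitude spectrum of the sampled tone
spectrum = np.fft.rfft(v_ideal * np.hanning(N))
freqs = np.fft.rfftfreq(N, d=1 / fs)

fig, (ax_t, ax_f) = plt.subplots(1, 2, figsize=(10, 3))
ax_t.plot(t[:200] * 1e6, v_ideal[:200])
ax_t.set(xlabel="Time (us)", ylabel="Amplitude", title="Time Domain")
ax_f.plot(freqs / 1e6, 20 * np.log10(np.abs(spectrum) + 1e-12))
ax_f.set(xlabel="Frequency (MHz)", ylabel="Magnitude (dB)", title="Frequency Domain")
plt.tight_layout()
plt.show()
```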
Now that we know what an ideal RF signal looks like, we can consider a more real-world representation of an RF signal. With real-world signals, some amount of random noise is added to the signal. This noise can be caused by any number of factors in a circuit's design, or even by the environment in which the circuit operates. To account for this, the real-world signal can be represented by the following mathematical formula.
V(t) = (A₀ + E(t))sin(ω₀t + φ(t))
Where:
A₀ = nominal amplitude
ω₀ = nominal frequency
E(t) = random amplitude change
φ(t) = random phase change
This real-world signal can then be plotted both in the time and frequency domains.
[Figure: the real-world signal plotted in the time domain and the frequency domain]
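Continuing the sketch above, the following Python snippet (with noise levels picked purely for visibility, not taken from any real device) adds a small random amplitude term E(t) and a slowly wandering phase term φ(t) to the same tone; instead of a single discrete line, the spectrum now shows a noise skirt spreading around the carrier.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

A0, f0, fs, N = 1.0, 10e6, 100e6, 65536   # illustrative values only

t = np.arange(N) / fs
E = 0.01 * rng.standard_normal(N)                # small random amplitude change E(t)
phi = np.cumsum(2e-3 * rng.standard_normal(N))   # slowly wandering phase change phi(t)

v_real = (A0 + E) * np.sin(2 * np.pi * f0 * t + phi)   # V(t) = (A0 + E(t))*sin(w0*t + phi(t))

spectrum = np.fft.rfft(v_real * np.hanning(N))
freqs = np.fft.rfftfreq(N, d=1 / fs)

plt.plot(freqs / 1e6, 20 * np.log10(np.abs(spectrum) / np.max(np.abs(spectrum))))
plt.xlim(9.5, 10.5)
plt.xlabel("Frequency (MHz)")
plt.ylabel("Magnitude (dB relative to carrier)")
plt.title("Real-world tone: noise skirt around the carrier")
plt.show()
```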
Phase Noise Effects on System Performance
Phase noise affects system performance in a wide range of applications. Two key examples demonstrate how a device with higher phase noise degrades overall performance: a radar system and a digital communications system.
Radar
A simplified radar system block diagram is shown below. The most significant potential sources of phase noise are the local oscillator (LO) and the amplifiers. The diagram illustrates how phase noise affects the radar system itself. The LO enables an intermediate frequency (IF) signal, which may be modulated, to be upconverted to the final transmitted microwave frequency. The same LO also enables the received microwave signal to be downconverted to an IF so that it can be digitized and processed. Ideally, the LO would generate a discrete CW tone at a single frequency. In practice, however, the LO signal is not perfect and contains imperfections, including phase noise, shown in red as the spectral content around the ideal tone.
Radar systems are heavily affected by high phase noise (the receiver is desensitized), since a component's phase noise can mask the radar signature that is returned. To illustrate this, refer to the image below. The dark blue trace is a low phase noise LO, and the light blue trace is the downconverted radar return. Using an LO with higher phase noise (in red), we can see how poor phase noise performance will mask a smaller radar return signal.
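To put rough numbers on this masking effect, the back-of-envelope sketch below (all values hypothetical) estimates the noise floor that a strong nearby return acquires from the LO's phase noise skirt and compares it with a weak target return at the same offset.

```python
import math

# Hypothetical example values, not taken from any specific radar
p_clutter_dbm = 0.0     # strong nearby return (e.g. clutter) after downconversion
lo_pn_dbc_hz = -110.0   # assumed LO phase noise at the offset of interest, dBc/Hz
offset_hz = 100e3       # frequency offset between the clutter and target returns
rx_bw_hz = 1e3          # receiver analysis bandwidth around the target

# The LO phase noise is transferred onto the strong return, so the noise it
# contributes in the receiver bandwidth at this offset is approximately:
pn_floor_dbm = p_clutter_dbm + lo_pn_dbc_hz + 10 * math.log10(rx_bw_hz)

p_target_dbm = -90.0    # weak target return at the same offset

print(f"Phase-noise floor near the clutter: {pn_floor_dbm:.1f} dBm")
print(f"Target return:                      {p_target_dbm:.1f} dBm")
print("Target is masked" if p_target_dbm < pn_floor_dbm else "Target is visible")
```

With these example numbers, the phase noise skirt sits at about -80 dBm in the receiver bandwidth, so the -90 dBm target return is buried; a lower phase noise LO lowers that floor and lets the small return emerge.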
Digital Communication
Communication systems are also affected by phase noise. The more complex the modulation scheme employed by the communication system, the more a component's phase noise will affect the system's overall performance. The diagram below shows a simplified view of the receiver to explain how phase noise affects the system. The LO enables the received signal to be downconverted to an IF so that it may be digitized and processed. Much like in the radar subsystem, ideally the LO would generate a discrete CW frequency tone.
Below is a QPSK signal with some phase noise causing the constellation to spread, but each symbol can still be accurately decoded. In this example, the phase noise performance of the LO does not affect the symbol error rate (SER) of the signal.
When the same phase noise is applied to a 16QAM signal, the receiving device may misinterpret symbols, causing the SER to increase.
To achieve a low SER similar to that of the QPSK signal, the communication system must integrate a lower phase noise LO. This can be seen in the 16QAM plot below.
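The effect can be reproduced with a short Monte Carlo sketch in Python (the RMS phase jitter and SNR values are assumptions chosen for illustration, not measurements of any particular LO): the same phase disturbance that leaves QPSK essentially error-free produces measurable symbol errors in 16QAM.

```python
import numpy as np

rng = np.random.default_rng(42)
N_SYM = 100_000
PHASE_RMS_RAD = np.deg2rad(5.0)   # assumed RMS phase jitter, purely illustrative
SNR_DB = 25.0                     # assumed additive-noise SNR

def awgn(symbols, snr_db):
    """Add complex white Gaussian noise for a given SNR (unit-power constellation)."""
    noise_power = 10 ** (-snr_db / 10)
    noise = np.sqrt(noise_power / 2) * (rng.standard_normal(symbols.size)
                                        + 1j * rng.standard_normal(symbols.size))
    return symbols + noise

def symbol_error_rate(constellation):
    """Send random symbols, apply phase jitter + AWGN, decode by nearest neighbor."""
    idx = rng.integers(0, constellation.size, N_SYM)
    tx = constellation[idx]
    jitter = PHASE_RMS_RAD * rng.standard_normal(N_SYM)   # random phase rotation
    rx = awgn(tx * np.exp(1j * jitter), SNR_DB)
    decoded = np.argmin(np.abs(rx[:, None] - constellation[None, :]), axis=1)
    return np.mean(decoded != idx)

# Unit-average-power constellations
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
levels = np.array([-3, -1, 1, 3])
qam16 = (levels[:, None] + 1j * levels[None, :]).ravel() / np.sqrt(10)

print(f"QPSK  SER with phase jitter: {symbol_error_rate(qpsk):.2e}")
print(f"16QAM SER with phase jitter: {symbol_error_rate(qam16):.2e}")
```

Because the 16QAM decision regions are much smaller than those of QPSK, the same angular spread pushes outer symbols across decision boundaries, which is why denser constellations demand a cleaner LO.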
Spectrum Analyzer Method for Measuring Phase Noise
Phase noise measurements were originally made with a spectrum analyzer. To perform the measurement, the instrument captures the entire spectrum of the signal. However, since phase noise is symmetrical about the carrier, only one side of the spectrum is required, which is referred to as a single sideband (SSB) phase noise measurement. The spectrum analyzer captures the spectrum with a specific resolution bandwidth (RBW) that can vary between instruments. To normalize the measurement across spectrum analyzers, the measured value is referenced to the carrier power (dBc) and the power spectral density is normalized to a 1 Hz bandwidth. This results in the units used for phase noise: dBc/Hz.
To be able to compare one phase noise measurement with another, the result is reported at a specific frequency offset from the carrier. Specifying the offset allows the measurement to be repeated from one DUT to the next. The image below shows how phase noise is measured using this method; the orange section of the plot shows where the phase noise would ideally be measured for this example.
-x dBc/Hz at offset y Hz
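As a concrete, hypothetical example of how such a figure is produced from spectrum analyzer marker readings, the short sketch below converts a noise reading taken in a non-1 Hz RBW into dBc/Hz at a given offset (the filter-shape correction discussed later is ignored here).

```python
import math

# Hypothetical marker readings from a spectrum analyzer trace
carrier_power_dbm = 10.0   # carrier peak power
noise_power_dbm = -65.0    # noise power measured at the offset, within the RBW
rbw_hz = 1000.0            # resolution bandwidth used for the sweep
offset_hz = 10e3           # offset from the carrier where the noise was read

# Reference the noise to the carrier (dBc), then normalize the power to a 1 Hz bandwidth
noise_dbc = noise_power_dbm - carrier_power_dbm
phase_noise_dbc_hz = noise_dbc - 10 * math.log10(rbw_hz)

print(f"{phase_noise_dbc_hz:.1f} dBc/Hz at {offset_hz / 1e3:.0f} kHz offset")
# With these example numbers: -105.0 dBc/Hz at 10 kHz offset
```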
Spectrum Analyzer Limitations
A spectrum analyzer has several limitations when measuring phase noise. These are attributed to the phase noise of the instrument itself, the dynamic range of the instrument, the RBW, and the shape of the filter used to obtain the RBW.
- Phase Noise Present in the Spectrum Analyzer: Due to how a spectrum analyzer is designed, the instrument's own phase noise cannot be removed from the measurement. A phase noise analyzer, by contrast, has two references that can be used to measure and mathematically eliminate the phase noise of the instrument. For the spectrum analyzer's phase noise performance not to affect the measurement, the analyzer should have a better phase noise specification than the device under test (DUT), typically with a ~10 dB margin.
- Dynamic Range Limitations: Since the carrier is included in the measurement, the spectrum analyzer's dynamic range can limit the range over which phase noise measurements can be made, especially at low carrier power levels or when the DUT phase noise is far below the carrier's amplitude. If the phase noise of the signal being measured is low relative to the carrier amplitude, the effective phase noise measurement range will be limited by the spectrum analyzer's noise floor.
- Resolution Bandwidth: Spectrum analyzers measure power with an RBW filter that is usually more than 1 Hz wide. The noise power measured by wider RBW filters must be normalized to a 1 Hz bandwidth. This normalization is done by reducing the measured noise power by N dB, where N = 10·log₁₀(RBW in Hz).
- Filter Shape: A real-world RBW filter is not perfectly rectangular and usually has a Gaussian or similar shape. Therefore, an estimated scaling or correction factor must be applied, as illustrated in the sketch after this list.
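To make the filter-shape correction concrete, the sketch below numerically computes the equivalent noise bandwidth of an idealized Gaussian RBW filter relative to its nominal -3 dB bandwidth. This is a generic illustration of where the correction factor comes from, not the calibrated value any particular instrument applies.

```python
import numpy as np

# Idealized Gaussian RBW filter (amplitude response), defined by its -3 dB bandwidth
rbw_3db_hz = 1000.0                              # nominal -3 dB resolution bandwidth
sigma = rbw_3db_hz / (2 * np.sqrt(np.log(2)))    # from |H(f)|^2 = 0.5 at f = RBW/2

f = np.linspace(-20 * rbw_3db_hz, 20 * rbw_3db_hz, 200_001)
h_power = np.exp(-(f / sigma) ** 2)              # power response |H(f)|^2, peak = 1

# Equivalent noise bandwidth: the width of the ideal rectangular filter that
# would pass the same noise power as the real filter.
enbw_hz = np.sum(h_power) * (f[1] - f[0])
correction_db = 10 * np.log10(enbw_hz / rbw_3db_hz)

print(f"ENBW = {enbw_hz:.1f} Hz vs. nominal RBW = {rbw_3db_hz:.0f} Hz")
print(f"Filter-shape correction approx {correction_db:.2f} dB (Gaussian assumption)")
```

For a Gaussian shape the equivalent noise bandwidth works out to roughly 1.06 times the -3 dB RBW, a fraction of a dB, which is why the correction is small but still needed for an accurate dBc/Hz result.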
How a Phase Noise Analyzer Measures Phase Noise
Phase noise analyzers present spectrum data in a similar way to a spectrum analyzer, but there are some key differences that allow these instruments to make more accurate amplitude measurements. The image below shows how a spectrum analyzer measures the signal's spectrum.
The section in the blue box on the right-hand side of the carrier corresponds to a phase noise analyzer's display. The next image shows how a phase noise analyzer presents similar data; the blue box in the image above marks the working area of the phase noise analyzer.
Advantages of a Phase Noise Analyzer
Since the phase noise analyzer filters out the carrier, the instrument has more range available for phase noise measurements. However, even with the added measurement range, the bottom of the range is still dictated by the noise floor of the instrument. Most phase noise analyzers show an estimated instrument noise floor; in the image below it is the shaded portion of the plot. Ideally, the measured phase noise of the DUT should sit more than 10 dB above this noise floor to have the highest confidence in the measurement.
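The reason for that margin is simple power arithmetic: the analyzer reports the sum of the DUT's phase noise and its own noise floor, so the closer the two are, the more the reading is biased high. The generic calculation below (not tied to any specific analyzer) shows the bias for a few margins.

```python
import math

def measurement_bias_db(margin_db: float) -> float:
    """Bias in the reported level when the DUT noise sits margin_db above
    the instrument noise floor (powers add, so the reading is always high)."""
    return 10 * math.log10(1 + 10 ** (-margin_db / 10))

for margin in (0, 3, 6, 10, 20):
    print(f"DUT {margin:>2} dB above the noise floor -> reading ~{measurement_bias_db(margin):.2f} dB high")
```

At a 10 dB margin the reading is only about 0.4 dB high, while at a 3 dB margin the error grows to nearly 2 dB, which is why the >10 dB guideline is used.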
Many phase noise analyzers employ cross-correlation. Cross-correlation splits the DUT signal into two uncorrelated measurement channels, which are then compared. The phase noise common to both channels is that of the DUT rather than the instrument, so performing more correlations filters out an increasing amount of the instrument's phase noise, up to some finite limit. The relationship between the number of cross-correlations and the reduction of the analyzer's noise floor is shown below.
Improvement factor (dB) = 5·log₁₀(N), where N is the number of correlations
Number of Correlations | 1 | 10 | 100 | 1,000 | 10,000
dB Improvement | 0 dB | 5 dB | 10 dB | 15 dB | 20 dB
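Given the 5·log₁₀(N) relationship above, a small helper (a sketch, assuming only that relationship) computes the improvement for any correlation count and inverts it to estimate how many correlations a target improvement would require, which is ultimately a trade against measurement time.

```python
import math

def noise_floor_improvement_db(n_correlations: int) -> float:
    """Noise floor reduction from cross-correlation: 5*log10(N)."""
    return 5 * math.log10(n_correlations)

def correlations_for_improvement(target_db: float) -> int:
    """Invert the relationship: N = 10^(improvement / 5)."""
    return math.ceil(10 ** (target_db / 5))

for n in (1, 10, 100, 1_000, 10_000):
    print(f"{n:>6} correlations -> {noise_floor_improvement_db(n):4.0f} dB improvement")

print(f"~{correlations_for_improvement(25):,} correlations needed for 25 dB")
```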
Summary
Some level of phase noise is present in all signals, and it degrades a signal's fidelity. The effect of phase noise can be seen in the degree to which it degrades a radar system's receiver sensitivity and the degree to which it increases the SER of a digital communication system. Because phase noise is so important to a system's performance, it is essential that it be measured in the most optimal manner possible. While a spectrum analyzer is an excellent tool for measuring the overall spectral content of a signal, a phase noise analyzer is a significantly better tool for measuring phase noise. Moreover, an instrument that can measure its own noise floor will provide the highest degree of measurement confidence.