The phase noise of free-running microwave sources has long been an important problem in many applications. It increases the bit error rate of a telecommunication link and degrades the sensitivity of a radar, particularly in the case of Doppler or FM-CW radars. Reducing this noise contribution is a difficult challenge for microwave engineers and circuit designers. The main contributor to this noise is well known to be the microwave transistor, so an improvement of the oscillator phase noise ultimately results from an optimization of the transistor phase noise.

The 10 kHz to 1 MHz offset frequency range is the most important one for many microwave oscillator applications. Improving the transistor (or oscillator) phase noise in this range requires a good knowledge of the noise mechanisms involved in the device. In this frequency range, two different mechanisms may be at the origin of the phase noise. The first is the conversion of the transistor baseband (1/f) noise to high frequencies through the device's nonlinearities. The second is the direct superposition of the transistor high-frequency noise: this noise is simply added to the carrier, and its contribution may be described using the amplifier noise figure.

In this paper, evidence of the transistor high-frequency noise contribution in residual phase noise data is demonstrated. This behavior is observed in several bipolar devices in which the low-frequency noise contribution has been carefully minimized using an optimized bias network. The phase noise behavior is then correlated with nonlinear noise figure measurements. The study has been carried out on numerous microwave transistors, including FET and bipolar devices, and in each case an increase of the noise figure with the microwave signal level has been observed.
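As a reminder, and as a standard approximation rather than a result of this work, the additive contribution of the high-frequency noise to the residual phase noise floor of an amplifier may be written in terms of its noise figure. The symbols used below ($F$, $k$, $T$, $P_{\mathrm{in}}$) are introduced here for illustration only:

\[
S_{\varphi}(f_m) \simeq \frac{F\,k\,T}{P_{\mathrm{in}}}
\qquad \text{or, in single-sideband form,} \qquad
\mathcal{L}(f_m) \simeq \frac{F\,k\,T}{2\,P_{\mathrm{in}}}
\]

where $F$ is the amplifier noise figure (linear), $k$ is Boltzmann's constant, $T$ the ambient temperature, and $P_{\mathrm{in}}$ the carrier power at the device input. Under this approximation, an increase of the noise figure with the microwave signal level translates directly into a degradation of the far-from-carrier phase noise floor.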