The exponential growth of wireless communication systems in recent years has created a need for low-noise, low-power, and low-cost wireless components. To meet these requirements, CMOS technology is increasingly used to implement the RF blocks of communication systems and is gradually replacing its traditional GaAs and silicon bipolar counterparts. This shift exploits the large integration capacity of CMOS technology and makes it possible to fabricate all the analog, digital, and RF blocks of a system on a single chip, known as a System-on-a-Chip (SoC).

Among the various RF blocks, the design of a low phase noise VCO is of particular concern, because the phase noise of the local oscillator is one of the parameters that most strongly affects signal quality in a communication system. This is especially true in OFDM systems, whose modulation scheme is highly sensitive to phase noise. A great deal of time and effort has been devoted to understanding the sources of phase noise in oscillators, but reaching a comprehensive conclusion is difficult for two reasons: (1) oscillators operate under large-signal conditions, so linear small-signal models cannot be used, and (2) phase noise generation is time-variant over the oscillation period.

In this thesis, different phase noise models and theories of CMOS LC oscillators, the most popular oscillators in RF applications, are studied, and a new model is proposed that accounts for the noise of the differential-pair transistors in the triode region. This noise contribution is usually ignored; by including it, we improve the existing theory of phase noise in these oscillators and then introduce a more accurate justification for the performance of one of the most popular phase noise reduction techniques. The conclusions are validated through circuit simulations of a 3.6 GHz NMOS oscillator in a 0.18 µm technology using Agilent ADS 2005A.
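As background (this definition is standard in the literature and not part of the original text), the phase noise referred to throughout this thesis is conventionally quantified as the single-sideband noise power measured in a 1 Hz bandwidth at an offset frequency Δf from the carrier f₀, normalized to the carrier power:

\[
\mathcal{L}(\Delta f) = 10 \log_{10}\!\left(\frac{P_{\mathrm{sideband}}(f_0 + \Delta f,\ 1\,\mathrm{Hz})}{P_{\mathrm{carrier}}}\right) \quad [\mathrm{dBc/Hz}]
\]

This is the figure of merit that the phase noise models and the reduction technique discussed in the thesis aim to predict and improve.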