
# A question of phase noise simulation in Cadence

Status
Not open for further replies.

#### OMEsystem

I ran a phase noise simulation of a VCO with PNoise in Cadence.
But in the simulation result (see attachment), the phase noise at 10 Hz offset
frequency is 13 dB higher than the carrier power. Is this result correct? How
should I read the phase noise from this plot?

Is it normal to get a plot like this?

Thanks,

Carrier power is in either watts or dBm. Phase noise is in dBc/Hz. How is it possible to say "the phase noise at 10 Hz offset frequency is 13 dB higher than the carrier power"?

biff44 said:
Carrier power is in either watts or dBm. Phase noise is in dBc/Hz. How is it possible to say "the phase noise at 10 Hz offset frequency is 13 dB higher than the carrier power"?

As you can see, if we integrate the power over a 1 Hz bandwidth, the power at 10 Hz
offset is +13 dBc, which means that by the plot the power there is 13 dB higher than the carrier.
Am I right? What puzzles me is that the plot looks strange. Is it reasonable to
obtain a simulation result like the one in the attachment?

thanks a lot,
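One way to see why this can happen: pnoise reports L(f) as the phase power spectral density divided by two, i.e. L(f) ≈ S_phi(f)/2, which is a small-angle approximation. Close to the carrier of a free-running VCO, S_phi(f) diverges (1/f² or 1/f³), so the reported density can rise above 0 dBc/Hz even though no real sideband carries more power than the carrier. A minimal sketch of this, assuming a white-FM (1/f²) noise profile with a made-up level `S0`:

```python
import numpy as np

# Assumed white-FM VCO noise: S_phi(f) = S0 / f**2 (rad^2/Hz); S0 is made up.
S0 = 400.0  # hypothetical white frequency-noise level, Hz^2/Hz

def L_small_angle_dBc(f):
    """L(f) as the small-angle approximation reports it: S_phi(f)/2, in dBc/Hz."""
    return 10 * np.log10(S0 / f**2 / 2)

print(L_small_angle_dBc(10.0))  # about +3 dBc/Hz: density "above the carrier"

# The true spectrum for white FM noise is a Lorentzian line with
# half-width f_l = pi * S0 / 2; its density is bounded by 1/(pi * f_l),
# and its total power still integrates to the carrier power (= 1 here).
f_l = np.pi * S0 / 2
f = np.logspace(-3, 9, 200001)
lorentzian = (f_l / np.pi) / (f**2 + f_l**2)
total = 2 * np.trapz(lorentzian, f)  # factor 2: both sidebands
print(total)  # close to 1.0: no more total power than the carrier
```

So a positive dBc/Hz value at 10 Hz offset is not unusual in a pnoise plot of a free-running VCO; it only means the small-angle approximation has broken down at that offset, not that the simulation is wrong.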

phase noise question

Is it correct to say that we cannot integrate the phase noise from 10 Hz to 1 MHz, for example?
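You can integrate L(f) over a band to get the RMS phase error, but only where L(f) stays well below 0 dBc/Hz, so the small-angle approximation holds; where the close-in noise rises toward or above 0 dBc, the integral no longer represents a phase variance. A sketch of the band-limited integration, using a made-up L(f) table standing in for exported pnoise data and an assumed 2.4 GHz carrier:

```python
import numpy as np

# Hypothetical L(f) points, e.g. as exported from a pnoise run (all made up).
f_pts = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])          # offset, Hz
L_pts = np.array([-30., -60., -90., -110., -120., -125.])  # dBc/Hz

# Interpolate log-log (straight segments on a dB-vs-log-f plot), then
# integrate both sidebands: sigma_phi^2 = 2 * integral of 10^(L/10) df.
fi = np.logspace(1, 6, 2001)
Li = np.interp(np.log10(fi), np.log10(f_pts), L_pts)
var_rad2 = 2 * np.trapz(10**(Li / 10), fi)

rms_rad = np.sqrt(var_rad2)
print(rms_rad)                       # integrated phase error, rad

f0 = 2.4e9                           # assumed carrier frequency, Hz
print(rms_rad / (2 * np.pi * f0))    # equivalent RMS jitter, s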

What is the minimum offset frequency according to Leeson model where we can say that this theoretical results match the practical result?

Status
Not open for further replies.