Hello
I made a simulation of a PLL. Its parameters are:
reference frequency: 3.9 MHz
output frequency in lock: about 308 MHz
frequency divider ratio: 79
charge pump current: 25 uA
I used a three-state PFD/CP and a 2nd-order passive loop filter.
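As a quick sanity check of the numbers above (values taken from this post):

```python
# Sanity check: in lock, f_out = N * f_ref for an integer-N PLL.
f_ref = 3.9e6   # reference frequency, Hz (3.9 MHz)
N = 79          # feedback divider ratio
f_out = N * f_ref
print(f"f_out = {f_out / 1e6:.1f} MHz")  # 308.1 MHz, i.e. "about 308 MHz"
```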
I simulated the closed-loop phase noise of the PLL with two configurations of the filter:
1) loop bandwidth 100kHz
2) loop bandwidth 400kHz
I noticed that in case 2) there are a few spurs near a 10 MHz offset from the fundamental frequency (308 MHz).
In case 1) there are no spurs.
So, my question is: what causes the spurs when the loop bandwidth is larger?
I know that increasing this bandwidth decreases the spur attenuation; that can be one reason.
What about PFD dead zones? Is their influence more significant when the bandwidth is larger?
If you are achieving the BW improvement just by changing the loop parameters, it normally takes a higher-valued resistor and a much lower-valued main capacitor for the same phase margin. This means higher spurs at the vctrl node.
PFD/CP mismatch-induced spurs can vary with the reference clock period, but not with the BW.
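To illustrate the point about the resistor and the main capacitor, here is a rough sizing sketch using the common textbook approximations for a charge-pump PLL with an R + C1 series branch shunted by C2 (zero and pole placed symmetrically around the crossover). The charge-pump current and divider ratio are from the post; the VCO gain (50 MHz/V) and the 60-degree phase margin are assumed values:

```python
import math

def loop_filter(f_c, pm_deg, I_cp, Kvco_hz_per_v, N):
    """Rough sizing of a 2nd-order passive loop filter (R in series with C1,
    shunted by C2) for a charge-pump PLL, placing the zero and pole
    symmetrically around the crossover frequency for the given phase margin."""
    w_c = 2 * math.pi * f_c
    b = math.tan(math.pi / 4 + math.radians(pm_deg) / 2)   # pole/zero spread
    Kvco = 2 * math.pi * Kvco_hz_per_v                     # rad/s per volt
    R = 2 * math.pi * N * w_c / (I_cp * Kvco)              # sets the crossover
    C1 = b / (w_c * R)                                     # zero at w_c / b
    C2 = 1 / (b * w_c * R)                                 # pole at w_c * b
    return R, C1, C2

# I_cp and N are from the post; Kvco = 50 MHz/V and PM = 60 deg are assumed.
narrow = loop_filter(100e3, 60, 25e-6, 50e6, 79)
wide = loop_filter(400e3, 60, 25e-6, 50e6, 79)
for label, (R, C1, C2) in (("100 kHz", narrow), ("400 kHz", wide)):
    print(f"{label}: R = {R/1e3:.1f} kOhm, C1 = {C1*1e12:.1f} pF, C2 = {C2*1e12:.2f} pF")
```

With these approximations, quadrupling the bandwidth quadruples R and shrinks C1 by a factor of 16, so more reference ripple survives at the vctrl node.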
The spur is observed in which simulation? PSS+pnoise or dft from transient?
I observe the phase noise calculated from transient analysis.
The relevant part of the user guide:
If the PLL noise performance is of interest, add a freq_meter instance to the test bench.
This instance measures the periods of VCO output in response to rise cross events and writes the periods into a file. After a transient analysis, a plugin searches this instance and the file and calculates the phase noise power spectrum density (PSD) from the periods in the file.
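The computation the guide describes can be sketched conceptually: accumulate the excess phase from the measured periods, then take its power spectral density. This is only an illustrative reconstruction, not the tool's actual algorithm, and the synthetic jitter value below is made up:

```python
import numpy as np

def phase_noise_from_periods(periods):
    """Illustrative estimate of SSB phase noise L(f) in dBc/Hz from a
    sequence of measured VCO periods (one sample per output cycle)."""
    periods = np.asarray(periods, dtype=float)
    T0 = periods.mean()                          # average (carrier) period
    fs = 1.0 / T0                                # effective sample rate
    # Excess phase accumulated over the cycles, in radians
    phi = 2 * np.pi * np.cumsum(periods - T0) / T0
    # One-sided windowed periodogram of the phase
    n = len(phi)
    w = np.hanning(n)
    X = np.fft.rfft(phi * w)
    S_phi = 2 * np.abs(X) ** 2 / (fs * np.sum(w ** 2))   # rad^2/Hz
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    L = 10 * np.log10(S_phi / 2)                 # SSB phase noise, dBc/Hz
    return f[1:], L[1:]                          # drop the DC bin

# Synthetic example: 308.1 MHz carrier with 100 fs RMS white period jitter
rng = np.random.default_rng(0)
T0 = 1 / 308.1e6
f, L = phase_noise_from_periods(T0 + rng.normal(0.0, 100e-15, 1 << 16))
```

Note that white period jitter makes the phase a random walk, so the resulting phase noise falls off at roughly -20 dB/decade with offset frequency.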
I ran the mentioned simulation for the circuit with an ideal frequency divider (written in Verilog-A) and with the transistor-level one. As a reference signal I used a sinusoidal source (3.9 MHz) with 50 ps rise and fall times.
I noticed that when I used the ideal divider, the phase noise shown on the graph was about 5 dBc/Hz worse than in the second case.
I thought that the divider does not have a big influence on the phase noise of the PLL output.
Can anybody explain this result to me, and why I obtained the worse value for the circuit with the ideal frequency divider?
Thanks for the answer.
Sorry, maybe I didn't write it clearly enough.
I ran two transient analyses with the added instance, which helped to measure the closed-loop phase noise of the PLL circuit (with a loop bandwidth of 100 kHz):
1) with the ideal frequency divider written in Verilog-A
2) with the transistor-level frequency divider
After the simulations it turned out that for case 1) the phase noise was worse (by about 5 dBc/Hz). I'm just curious whether it is possible that the circuit with the ideal frequency divider has a worse-quality output signal.