
interpretation of PLL simulation results


jas2005


Hello
I made a simulation of a PLL. Its parameters are:
reference frequency: 3.9 MHz
output frequency in lock: about 308 MHz
frequency divider ratio: 79
charge pump current: 25 µA
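
(A quick sanity check of these numbers, just my own illustration in Python, values taken from the post:)

Code:
# consistency check of the posted parameters
f_ref = 3.9e6   # reference frequency [Hz]
N = 79          # feedback divider ratio
print(f"expected locked output frequency: {f_ref * N / 1e6:.1f} MHz")  # ~308.1 MHz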

I used a three-state PFD/CP and a 2nd-order passive loop filter.

I simulated the closed-loop phase noise of the PLL with two configurations of the filter:
1) loop bandwidth 100 kHz
2) loop bandwidth 400 kHz

I noticed that in case 2) there are a few spurs near a 10 MHz offset from the fundamental frequency (308 MHz).
In case 1) there are no spurs.

So my question is: what causes the spurs when the loop bandwidth is larger?

I know that increasing this bandwidth reduces the spur attenuation; that could be one reason.
What about PFD dead zones? Is their influence more significant when the bandwidth is larger?

Thanks in advance for answers
 

If you are increasing the bandwidth just by changing the loop filter parameters, it normally takes a higher-valued resistor and a much lower-valued main capacitor for the same phase margin. That means more ripple on the control voltage, and therefore higher spurs at the vctrl node.
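
To illustrate the trend, here is a minimal sketch using the standard type-2, 2nd-order passive filter design equations. The charge pump current and divider ratio are from the post; the VCO gain (100 MHz/V) and phase margin (55°) are placeholder assumptions, since the thread does not give them.

Code:
# Sketch: size a 2nd-order passive loop filter (series R + C_big, shunted by C_small)
# for a given loop bandwidth and phase margin. Icp and N come from the post;
# Kvco = 100 MHz/V and PM = 55 deg are assumed placeholders.
import math

def loop_filter(f_c, icp=25e-6, kvco=100e6, n=79, pm_deg=55.0):
    wc = 2 * math.pi * f_c
    pm = math.radians(pm_deg)
    t1 = (1 / math.cos(pm) - math.tan(pm)) / wc   # pole time constant
    t2 = 1 / (wc ** 2 * t1)                       # zero time constant (max PM at f_c)
    c_tot = icp * kvco / (n * wc ** 2) * math.sqrt(
        (1 + (wc * t2) ** 2) / (1 + (wc * t1) ** 2))
    c_small = c_tot * t1 / t2                     # shunt ripple capacitor
    c_big = c_tot - c_small                       # main integrating capacitor
    r = t2 / c_big                                # zero-setting resistor
    return r, c_big, c_small

for bw in (100e3, 400e3):
    r, c_big, c_small = loop_filter(bw)
    print(f"BW {bw/1e3:.0f} kHz: R = {r/1e3:.1f} kOhm, "
          f"C_big = {c_big*1e12:.0f} pF, C_small = {c_small*1e12:.1f} pF")

With these assumptions the 400 kHz design comes out with roughly 4x the resistor and about 1/16 of the main capacitance, which matches the trend described above: the larger R gives larger ripple steps on vctrl for the same charge-pump current pulses.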
PFD/CP mismatch-induced spurs can vary with the reference clock period, but not with the BW.
In which simulation do you observe the spurs: PSS + pnoise, or a DFT from transient?
 

Thanks for the answer.

I observe the phase noise calculated from a transient analysis.

Here is the relevant part of the user guide:
If the PLL noise performance is of interest, add a freq_meter instance to the test bench.
This instance measures the periods of the VCO output in response to rising cross events and writes the periods to a file. After a transient analysis, a plugin finds this instance and the file and calculates the phase noise power spectral density (PSD) from the periods in the file.
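
For what it's worth, here is a minimal sketch of that post-processing step in Python; it is my own approximation of what such a plugin does, and the file name, format, and FFT settings are assumptions, not the vendor tool.

Code:
# Estimate the phase-noise PSD from a file of measured VCO periods
# (one period per line, in seconds). Approximates the plugin described above.
import numpy as np
from scipy.signal import welch

periods = np.loadtxt("vco_periods.txt")   # measured VCO periods [s] (assumed file)
t_edges = np.cumsum(periods)              # rising-edge timestamps
T_nom = np.mean(periods)                  # use the mean period as the nominal one
k = np.arange(1, len(periods) + 1)

# excess phase at each edge: deviation from the ideal edge grid, in radians
phi = 2 * np.pi * (k * T_nom - t_edges) / T_nom

# treat the edge-sampled phase as uniform samples at the carrier rate and
# estimate its one-sided PSD; L(f) ~= S_phi(f)/2 for small phase deviations
f, s_phi = welch(phi, fs=1.0 / T_nom, nperseg=4096)
for fi, si in zip(f[1:8], s_phi[1:8]):
    print(f"{fi:12.3e} Hz  {10 * np.log10(si / 2):8.1f} dBc/Hz")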
 

Hello
I have one more question.

I ran the mentioned simulation for the circuit with an ideal frequency divider (written in Verilog-A) and with the transistor-level one. As a reference signal I used a sinusoidal source (3.9 MHz) with 50 ps rise and fall times.

I noticed that when I used the ideal divider, the phase noise shown on the graph was about 5 dBc/Hz worse than in the second case.

I thought that the divider does not have a big influence on the phase noise of the PLL output signal.

Can anybody explain this result to me, and why I obtained the worse value for the circuit with the ideal frequency divider?
 

jas2005 said:
Thanks for the answer.
This instance measures the periods of the VCO output in response to rising cross events and writes the periods to a file. After a transient analysis, a plugin finds this instance and the file and calculates the phase noise power spectral density (PSD) from the periods in the file.

Then it is prone to numerical errors. Run it with stiff tolerances.

jas2005 said:
I noticed that when I used the ideal divider, the phase noise shown on the graph was about 5 dBc/Hz worse than in the second case.

I thought that the divider does not have a big influence on the phase noise of the PLL output signal.

Can anybody explain this result to me, and why I obtained the worse value for the circuit with the ideal frequency divider?
The question is not very clear to me. What is the difference between the first sim and the second?
 

Hello
Then it is prone to numerical errors. Run it with stiff tolerances.
I used a time tolerance for the cross event equal to 1 ps, but OK, I'll make this tolerance stiffer.
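
(Just as a rough illustration of why that tolerance can matter, a back-of-the-envelope estimate of my own, not something from the simulator: if the measured edge times are only resolved to within the cross-event tolerance, that quantization alone sets a white phase-noise floor.)

Code:
# Rough estimate of the phase-noise floor caused by quantizing edge times to a
# cross-event time tolerance q (assumes white, uniformly distributed timing error).
import math

q = 1e-12       # cross-event time tolerance from the post [s]
f_out = 308e6   # PLL output frequency [Hz]

sigma_t = q / math.sqrt(12)                # rms timing error of uniform quantization
sigma_phi = 2 * math.pi * f_out * sigma_t  # rms excess phase [rad]
s_phi = 2 * sigma_phi ** 2 / f_out         # flat one-sided phase PSD [rad^2/Hz]
print(f"numerical phase-noise floor ~= {10 * math.log10(s_phi / 2):.0f} dBc/Hz")

With a 1 ps tolerance this floor sits around -150 dBc/Hz; whether that is enough to explain a 5 dB difference depends on where the real phase noise of the design sits.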

The question is not very clear to me. What is the difference between the first sim and the second?
Sorry, maybe I didn't write it clearly enough.
I made two transient analyses with the added instance, which measures the closed-loop phase noise of the PLL circuit (with loop bandwidth 100 kHz):
1) with the ideal frequency divider written in Verilog-A
2) with the transistor-level frequency divider

After the simulations it turned out that in case 1) the phase noise was worse (by about 5 dBc/Hz). I'm just curious whether it is possible that the circuit with the ideal frequency divider has a worse-quality output signal.
 
