rafael2323
Newbie level 4
Hi all,
I have a system that works as an analog PLL (PFD, loop filter, VCO, N divider) to stabilize the phase variations of a reference. (see attachment)
However, there is a significant delay of 50 microseconds in the feedback loop between the VCO and the N divider, due to long-distance transmission.
What is the best way to analyze the stability of this system?
I tried root locus in MATLAB, but there the delay must be modelled as a rational function, and rational approximations of a pure delay (Padé approximations) are only valid at low frequencies, whereas the delayed signal has a frequency of 100 MHz. Using a 1st-order Butterworth low-pass is not valid either (the filter would have to be very narrow to produce the 50 us delay).
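One way around the Padé limitation is to work in the frequency domain, where the delay needs no approximation at all: it contributes exactly exp(-jwT) to the open-loop response, i.e. a phase lag of w*T radians with unity magnitude. A minimal Python sketch of a phase-margin check, assuming a hypothetical type-2 loop (open-loop gain K*(1 + s/wz)/s^2) with an illustrative ~1 kHz crossover (K, wz, and the crossover are my assumptions, not values from the post):

```python
import numpy as np

# Hypothetical type-2 PLL open-loop gain G(s) = K*(1 + s/wz)/s^2,
# with the 50 us transport delay included exactly as exp(-s*T).
# K and wz are illustrative, chosen to place the unity-gain
# crossover near 1 kHz; they are NOT from the original post.
T = 50e-6                      # feedback delay, seconds
wc_target = 2 * np.pi * 1e3    # intended crossover, rad/s
wz = wc_target / 4             # loop-filter zero
K = wc_target**2 / np.sqrt(1 + (wc_target / wz)**2)  # sets |G(j*wc)| = 1

w = np.logspace(2, 5, 20000)   # frequency grid, rad/s
s = 1j * w
G_nodelay = K * (1 + s / wz) / s**2
G_delay = G_nodelay * np.exp(-s * T)   # delay: exact, no Pade needed

def phase_margin(G, w):
    """Phase margin (deg) at the unity-gain crossover of an open-loop response."""
    i = np.argmin(np.abs(np.abs(G) - 1.0))   # index closest to |G| = 1
    return w[i], 180.0 + np.degrees(np.angle(G[i]))

wc, pm0 = phase_margin(G_nodelay, w)
_, pm1 = phase_margin(G_delay, w)
print(f"crossover ~ {wc / (2 * np.pi):.0f} Hz")
print(f"phase margin without delay: {pm0:.1f} deg")
print(f"phase margin with 50 us delay: {pm1:.1f} deg")  # delay costs wc*T rad
```

Note that in this view the 100 MHz signal frequency is irrelevant; what matters is the loop bandwidth relative to 1/T. The delay subtracts wc*T of phase margin at crossover (here about 18 degrees for a 1 kHz crossover), so the loop bandwidth has to be kept well below 1/T for the loop to stay stable.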
Any idea how to analyze this?
Where can I get information about the analysis and influence of a delay on PLL stability?
Thanks a lot.
Rafael