
Delay in the feedback loop and PLL Stability Analysis

Status
Not open for further replies.

rafael2323
Newbie level 4, joined Mar 16, 2007
Hi all,

I have a system that works as an analog PLL (PFD, loop filter, VCO, N divider) to stabilize the phase variations of a reference (see attachment).
However, there is a significant delay of 50 microseconds, due to long-distance transmission, in the feedback loop between the VCO and the N divider.

What is the best way to analyze the stability of this system?

I tried root locus in MATLAB, but the delay must be modelled as a rational function, and rational approximations of a pure delay (Padé approximations) are only valid at low frequency, whereas the signal passing through the delay has a frequency of 100 MHz. Using a 1st-order Butterworth low-pass filter is not valid either (the filter would have to be very narrow to produce the 50 µs delay).
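The limitation described above can be seen numerically. The sketch below (my own illustration, not from the thread) compares the phase of a pure 50 µs delay exp(−jωT) with its 1st-order Padé approximation (1 − sT/2)/(1 + sT/2); the approximation tracks the true phase only up to roughly f ≈ 1/(2πT) ≈ 3 kHz, far below 100 MHz:

```python
import cmath
import math

T = 50e-6  # feedback delay in seconds (from the question)

def phase_exact(f):
    # Unwrapped phase of exp(-j*2*pi*f*T) is simply -2*pi*f*T.
    return -2 * math.pi * f * T

def phase_pade1(f):
    # 1st-order Pade approximation of exp(-s*T): (1 - sT/2)/(1 + sT/2),
    # evaluated at s = j*2*pi*f. cmath.phase returns a value in (-pi, pi].
    w = 2 * math.pi * f
    h = (1 - 1j * w * T / 2) / (1 + 1j * w * T / 2)
    return cmath.phase(h)

for f in (1e3, 20e3, 100e3):
    err = math.degrees(phase_pade1(f) - phase_exact(f))
    print(f"f = {f:8.0f} Hz   phase error = {err:10.2f} deg")
```

At 1 kHz the phase error is a fraction of a degree, but by 100 kHz the approximation is off by hundreds of degrees, which is why a root-locus model built on it cannot be trusted at the frequencies involved here.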

Any idea how to analyze this?

Where can I get information about the analysis and the influence of a delay on PLL stability?

Thanks a lot.
Rafael

[Attachment: Delay ProblemD.png]

In principle, you can calculate the phase shift caused by the delay at a given frequency and subtract this phase from the phase margin (without delay).
But you can also "realize" a fixed delay, for example using a properly designed transmission-line model or an artificial voltage source exp(-sT).
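The first suggestion above can be sketched in a few lines. A pure delay exp(−sT) adds a phase lag of 360·f·T degrees at frequency f, so at the open-loop crossover frequency that lag comes straight off the phase margin. The crossover frequency and delay-free phase margin below are illustrative assumptions, not values from the thread:

```python
T = 50e-6            # feedback delay in seconds (from the question)
f_c = 500.0          # assumed open-loop unity-gain (crossover) frequency, Hz
pm_no_delay = 60.0   # assumed phase margin without the delay, degrees

# A pure delay exp(-s*T) contributes a phase lag of 360 * f * T degrees.
delay_phase = 360.0 * f_c * T          # 9 degrees for these numbers
pm_with_delay = pm_no_delay - delay_phase

print(f"delay phase lag at crossover: {delay_phase:.1f} deg")
print(f"phase margin with delay:      {pm_with_delay:.1f} deg")
```

The loop remains stable as long as the reduced margin stays positive (with some engineering reserve); note that the lag grows linearly with crossover frequency, so a wide loop bandwidth is quickly ruled out by a 50 µs delay.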
 
