amsdesign
A DLL (delay-locked loop) adjusts the delay of its output clock (whether it leads or lags) so that the output stays in phase with the reference clock.
I understand how a positive delay can be created to correct for a lag in the DLL. A simple buffer can create a positive delay.
But the minimum delay possible is 0 when there is no delay element.
How does a DLL correct for a phase lead? I.e., how does it create a negative delay in the clock path to reduce the phase shift?
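For what it's worth, here is a small numeric sketch (mine, not from the post) of the usual resolution: because a clock is periodic with period T, a negative delay of d is indistinguishable from a positive delay of T - d, so the delay line only ever needs to insert a positive delay. The values T and `advance` below are arbitrary illustration numbers.

```python
T = 10.0                   # clock period (arbitrary units)
advance = 2.5              # desired "negative delay" (phase-lead correction)
delay = (T - advance) % T  # the positive delay the DLL actually inserts: 7.5

in_edges = [n * T for n in range(1, 6)]          # reference clock edges
delayed  = [t + delay for t in in_edges]         # edges after the delay line
wanted   = [t - advance for t in in_edges]       # edges of an "advanced" clock

# The delayed sequence is the wanted (advanced) sequence shifted by exactly
# one whole period, so edge-for-edge the two waveforms are identical.
assert all(abs(d - (w + T)) < 1e-9 for d, w in zip(delayed, wanted))
print(delay)  # 7.5
```

In other words, the DLL never produces a true negative delay; its control loop simply drives the delay line toward T - d, and periodicity makes that equivalent to advancing the clock by d.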