
Create a long delay using inverter gates?

Status
Not open for further replies.

Electric_Shock (Junior Member level 2, joined Nov 9, 2017)
In my asynchronous SAR ADC, I need a delay of hundreds of nanoseconds for the capacitive DAC to settle. I realize that it is difficult to create such a large delay with an inverter chain, and doing so costs a lot of power and area. Is there a more effective way to create such a long delay? Thanks in advance.
 

As with most analog designs today, there is a digital wrapper around the core. You can control the timing externally and let the DAC settle very easily; counting clock cycles would work.
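A back-of-the-envelope sketch of the clock-cycle counting idea (Python; the 300 ns settling target and 50 MHz wrapper clock are assumed example numbers, not values from this thread):

```python
import math

def settle_cycles(t_settle_s: float, f_clk_hz: float, margin: float = 1.0) -> int:
    """Clock cycles the digital wrapper must count before treating the DAC as settled.

    margin > 1 adds headroom for clock-frequency and settling-time variation.
    """
    return math.ceil(t_settle_s * margin * f_clk_hz)

# Assumed example: 300 ns settling target, 50 MHz (20 ns period) wrapper clock.
n = settle_cycles(300e-9, 50e6)
```

One nicety of this approach over an inverter chain: the delay scales with the clock period, so its variation is set by the (usually well-controlled) clock source rather than by process, voltage, and temperature of a long gate chain.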
 
It is not practical to build a delay that large out of inverters, not to mention its variation. In the case of a SAR it is best to keep the timing internal to the SAR and not go outside, especially if the SAR is asynchronous. For your timer, it may be a good idea to think about some sort of replica circuit that tracks the DAC delay, rather than the brute-force inverter-chain approach.
 
Long delays made with leisurely ramps tend to pick up a lot of jitter. Aperture jitter is a problem in applications where you are trying to use the ADC data stream for frequency-domain work; it adds spurious tones.

A timing ramp and comparator can at least address the power problem better than a bazillion inverters in a string. Another approach could be a PLL / DLL working from the master clock and producing derivative clock phases, but jitter / phase noise may remain an issue to be solved.
 
As with most analog designs today, there is a digital wrapper around the core. You can control the timing externally and let the DAC settle very easily; counting clock cycles would work.

Can you tell me some keywords related to this technique? The settling time of my DAC looks very slow compared to other works, although I tried to reduce the on-resistance of the switch.
 

You would have to learn a bit of digital design; I can't teach that in a forum post.
 

I realize that it is difficult to use the inverter chain ...

An RC network feeding a simple inverter gate can be used to introduce a fixed delay. You can also add a reverse-biased diode to discharge the capacitor quickly, in case that is needed.
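As a quick sanity check on the RC sizing (Python sketch; the 1.2 V supply, mid-rail inverter trip point, and the R and C values are illustrative assumptions, not values from this thread), the delay follows from the single-pole charging equation t = -RC·ln(1 - Vtrip/VDD):

```python
import math

VDD = 1.2    # assumed supply, volts
VTRIP = 0.6  # assumed inverter switching threshold (mid-rail)

def rc_delay(r_ohm: float, c_f: float, vdd: float = VDD, vtrip: float = VTRIP) -> float:
    """Time for an RC-charged node (step input) to reach the inverter trip point."""
    return -r_ohm * c_f * math.log(1.0 - vtrip / vdd)

# With a mid-rail trip point, t = RC * ln(2): 433 kOhm with 1 pF gives ~300 ns.
t = rc_delay(433e3, 1e-12)
```

This also makes the earlier objections concrete: hundreds of kilohms or tens of picofarads on chip cost real area, and the trip point Vtrip moves with process and supply, so the delay is far less stable than a clock-derived one.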
 

An RC network feeding a simple inverter gate can be used to introduce a fixed delay. You can also add a reverse-biased diode to discharge the capacitor quickly, in case that is needed.

But as @dick_freebird said, won't a slow ramp pick up noise and jitter?
 

But as @dick_freebird said, won't a slow ramp pick up noise and jitter?

That depends; what is the time interval or frequency range you are interested in? IMHO, noise and jitter have nothing to do with whether it is a fast or slow ramp.
 

The ramp dV/dt transforms dV (supply noise and noise riding on the ramp from whatever source) into dt (jitter). A slower ramp makes a larger time deviation for the same amplitude of noise.

Slow enough ramps also bring the possibility of "chatter" unless you have hysteresis greater than the input-referred noise amplitude.
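The dV-to-dt conversion above can be put in numbers (Python sketch; the 1 mV rms noise figure and the two slew rates are illustrative assumptions, not values from this thread):

```python
def ramp_jitter_rms(sigma_v: float, slew_v_per_s: float) -> float:
    """RMS threshold-crossing jitter: rms voltage noise divided by ramp slew rate."""
    return sigma_v / slew_v_per_s

SIGMA_V = 1e-3  # assumed 1 mV rms noise at the comparator input

fast = ramp_jitter_rms(SIGMA_V, 1.0e9)  # 1 V/ns ramp
slow = ramp_jitter_rms(SIGMA_V, 1.0e6)  # 1 V/us ramp
```

Slowing the ramp by three orders of magnitude (1 V/ns down to 1 V/µs) multiplies the jitter by the same factor, from picoseconds to nanoseconds, for the same noise amplitude.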
 

Jitter in this case hardly matters. A delay built from inverters, or even from a ramp (unless one goes to a lot of trouble to make an exact ramp), will have variation with temperature and supply that is more severe than the jitter.
 
