
Please solve my analog-to-digital converter doubt


hacksgen

This is with regard to the dual-slope ADC.

Assuming the input signal frequency is 10 kHz, what should the integration period be?

As I understand it, the sampling frequency should be at least 2 times the input frequency, or more.

I assume 10 times the input frequency, which gives me 10 × 10 kHz = 100 kHz.

I am guessing that the integration time corresponds to this frequency, so we get 1/100 kHz = 10 microseconds. Am I right in my understanding?

I haven't understood why the books keep saying that the conversion time for a dual slope is greater than 1 millisecond.

If I am wrong, please explain how the integration time should be calculated based on the input signal frequency.

Thanks for your help.
 

Isn't it due to technological limits?
The function itself shows no limits in time.
 

Could you explain it to me more clearly?
 

Read this:
**broken link removed**
 

In practice, a dual-slope ADC won't use an integration period shorter than 1 ms, because no suitable resolution could be achieved by the usual techniques of de-integration interval measurement. You won't find a commercial dual-slope ADC able to acquire a 10 kHz signal frequency. As an additional remark, the integration period isn't the sampling period; the latter is what counts regarding the Nyquist criterion.
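
For illustration, a minimal sketch of that relation (the helper name and the example clock value are assumptions, not from the thread): the de-integration interval is measured by counting clock periods, so N-bit resolution needs roughly 2^N counts inside the integration period.

# Minimal sketch (assumed names and values): N-bit resolution needs about
# 2**N counts of the counter clock during de-integration.

def min_integration_period(n_bits: int, f_clk_hz: float) -> float:
    """Shortest integration period that leaves room for 2**n_bits counts."""
    return (2 ** n_bits) / f_clk_hz

# With a counter clock of a few MHz (an assumed, typical value), 12 bits
# already push the integration period into the millisecond range:
print(min_integration_period(12, 4e6))  # 4096 / 4 MHz = 1.024 ms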
 

To FvM

I don't understand how resolution is involved here. Taking the above example into consideration, let's say the integration period is 10 µs, for a 130 nm technology with a 1.2 V supply, N = 12 bit resolution, and a maximum input voltage of 1 V.

We can calculate the input resolution as 1 V / 2^12 = 244 µV.

The clock frequency of the counter can be obtained from the integration period as

(2^N) * Tclk = 10 µs ---> Tclk = 10 µs / 4096 ≈ 2.44 ns ---> Fclk ≈ 409.6 MHz

So, assuming that such a high-speed clock Fclk is available for the counter, it should be possible to design such an ADC, shouldn't it?
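
For reference, a quick numeric check of the figures above (a sketch with hypothetical variable names, for illustration only):

# Numeric check of the figures above (assumed variable names).
N = 12            # resolution in bits
t_int = 10e-6     # integration period, 10 us (the value assumed above)
v_max = 1.0       # maximum input voltage, 1 V

lsb = v_max / 2 ** N      # input resolution
t_clk = t_int / 2 ** N    # required counter clock period
f_clk = 1.0 / t_clk       # required counter clock frequency

print(lsb)    # ~244e-6 V
print(t_clk)  # ~2.44e-9 s
print(f_clk)  # ~409.6e6 Hz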

Please correct me if I am wrong.

To fala

Thanks for pointing out the link, but it doesn't solve the question I asked. Also, please don't take power-line noise into consideration here, as I assume it won't be present.


Your answers are much appreciated.
 

You're basically correct!

Using a clock frequency of several hundred MHz, a considerable resolution could be achieved, at least within the digital circuit. Actually, I didn't say this would be impossible; I said, first, that it isn't used in practice and, second, that you won't find a commercial device. In my view, dual slope as the dominant technique for slow ADCs has been almost entirely replaced by sigma-delta. I don't want to analyze this in detail, but you can probably confirm it as an observation.

There are some basic properties that generally limit dual-slope accuracy. The loss factor of the integration capacitor is one, but it depends only slightly on ADC speed. Much more speed-dependent is non-ideal comparator behaviour. It may be difficult to achieve 12-bit accuracy in the analog circuit part at the said speed. I guess no designer has actually tried during the last 10 or even 20 years, because other ADC operating principles most likely promise more benefit.

But I'm not an IC designer and have no need to decide anything in this field. As an analog designer who uses ADCs, I can say that I haven't started a new design with a dual slope in the last 10 years.

Regards,
Frank
 

Thanks for the reply, much appreciated.
 
