hacksgen
This is with regard to a dual-slope ADC.
Assuming the input signal frequency is 10 kHz, what should the integration period be?
As I understand it, the sampling frequency should be at least 2 times the input frequency.
I assume 10 times the input frequency, which gives me 10 * 10 kHz = 100 kHz.
I am guessing that the period of this frequency is the integration time, so we get 1/100 kHz = 10 microseconds. Am I right in my understanding?
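Here is my quick arithmetic check (the 10x oversampling factor is just my own assumption):

f_in = 10e3                  # input signal frequency: 10 kHz
f_sample = 10 * f_in         # assumed sampling rate: 100 kHz
t_sample = 1 / f_sample      # sampling period
print(t_sample)              # 1e-05 s, i.e. 10 microseconds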
I haven't understood why the books keep saying that the conversion time for a dual-slope ADC is greater than 1 millisecond.
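To see where a figure above 1 ms might come from, I tried a rough estimate, assuming a counter-based dual-slope converter with a 12-bit count and a 1 MHz reference clock (both numbers are my own guesses, not from any book):

# Rough dual-slope timing estimate; clock rate and bit count are my guesses
f_clk = 1e6                             # assumed reference clock: 1 MHz
n_bits = 12                             # assumed resolution: 12 bits
t_runup = (2 ** n_bits) / f_clk         # fixed run-up (integration) time
t_rundown_max = (2 ** n_bits) / f_clk   # worst-case run-down at full scale
t_conv_max = t_runup + t_rundown_max
print(t_runup)                          # 0.004096 s -> about 4 ms run-up alone
print(t_conv_max)                       # 0.008192 s -> about 8 ms worst case

Even with these guessed numbers, the run-up alone takes about 4 ms, nowhere near my 10 microsecond figure, so I suspect I am confusing the sampling period with the integration period.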
If I am wrong, please explain how the integration time should be calculated based on the input signal frequency.
Thanks for your help.