What is the ideal way to choose the tracking bandwidth of an A/D converter? Does it have any relation to the sampling frequency? I know it should be greater than the incoming signal bandwidth, but does it have any relation to the sampling rate? Please help.
Yes: the max. RC time constant τ = RC of the S/H circuit depends on the required resolution (accuracy, given as the number of bits b) of the ADC, its sampling frequency f, and the necessary hold time th for the conversion. To charge the sample capacitor up to the required accuracy of b bits, n = ln(2^b) = b·ln(2) RC time constants are necessary; hence the necessary sampling time is ts = ln(2^b)·τ, e.g. ≈7τ for a 10-bit ADC.
The max. sampling frequency f is the inverse of the sum of the sampling time ts and the hold time th: f ≤ 1/(ts + th).
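The two relations above can be combined into a quick sizing calculation. The sketch below (Python, with hypothetical example numbers, not from the original posts) solves for the maximum allowed τ = RC given the resolution, sampling rate, and hold time:

```python
import math

def max_rc_time_constant(bits, f_sample, t_hold):
    """Max S/H time constant tau = RC so the sample cap settles to 1 LSB.

    From the posts: settling to within 2^-bits takes n = ln(2^bits)
    = bits * ln(2) time constants, and the time available for sampling
    is ts = 1/f_sample - t_hold.  So tau_max = ts / (bits * ln(2)).
    """
    t_sample = 1.0 / f_sample - t_hold
    if t_sample <= 0:
        raise ValueError("hold time leaves no sampling time at this rate")
    return t_sample / (bits * math.log(2))

# Hypothetical example: 10-bit ADC at 1 MS/s with a 500 ns hold time.
tau = max_rc_time_constant(bits=10, f_sample=1e6, t_hold=500e-9)
print(tau)  # about 72 ns
```

So for these (made-up) numbers the S/H input network would need τ ≈ 72 ns or less, i.e. an RC bandwidth well above the sampling frequency.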
Thanks erikl. Can you point me to a paper/book with the reasoning behind the n = ln(2^b) RC formula?
Also, it is said that the RC bandwidth in general should be much larger than the sampling frequency. I understand the RC bandwidth should be larger than the highest frequency one needs to sample, but why should it be much larger than the sampling frequency? Please comment.
It's basic circuit theory; you should find it in any textbook on the subject:
The charging curve of a capacitor C via a resistor R is V/V0 = 1 − e^(−t/τ), with τ = RC.
Then the error (the difference between the voltage on the sample capacitor and the voltage to be sampled) after time t is 1 − V/V0 = e^(−t/τ), and this error should be equal to or less than the resolution of the ADC, which is 1 LSB/(2^b) LSB = 2^(−b). Setting e^(−t/τ) ≤ 2^(−b) and solving for t gives t ≥ ln(2^b)·τ = b·ln(2)·τ, which is the formula above.
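This derivation is easy to check numerically. A minimal Python sketch, using the 10-bit case from the earlier post:

```python
import math

# Settling error after t = n time constants is e^(-n).
# For b-bit accuracy we need e^(-n) <= 2^-b, i.e. n >= b * ln(2).
b = 10
n_required = b * math.log(2)        # number of time constants needed
error_at_n = math.exp(-n_required)  # settling error after that time

print(n_required)  # about 6.93, i.e. the ~7 tau quoted for 10 bits
print(error_at_n)  # equals 2**-10, i.e. exactly 1 LSB at 10 bits
```

The ≈7τ figure for a 10-bit ADC falls straight out of n = b·ln(2).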