Hello everyone,
I have a basic doubt about sigma-delta ADCs. All the books say that the larger the number of output cycles we average, the better the accuracy of the result.
Let's say the input varies between +0.5 V and -0.5 V and the quantizer outputs either +0.5 or -0.5. Assuming zero initial conditions: if the input is 0.2 V, the average of the quantizer output equals the input after 10 cycles. If the input is 0.3 V, the average equals the input after 5 cycles. If the input is 0.25 V, it takes 4 cycles. If the input is 0.275 V, the average only gets close to the input, in about 9 cycles.
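For what it's worth, the cycle counts above can be reproduced with a minimal sketch of an ideal first-order modulator. This assumes the common discrete-time model u[n] = u[n-1] + x - y[n-1] with a 1-bit quantizer that outputs ±0.5 (the function name and model are my own assumptions, not from any particular datasheet):

```python
def running_average(x, n_cycles):
    """Average the 1-bit quantizer output of an assumed ideal
    first-order sigma-delta modulator over n_cycles, for a
    constant DC input x in [-0.5, +0.5]."""
    u = 0.0        # integrator state (zero initial condition)
    y = 0.0        # previous quantizer output, fed back
    total = 0.0
    for _ in range(n_cycles):
        u += x - y                    # integrate the error x - y
        y = 0.5 if u >= 0 else -0.5   # 1-bit quantizer
        total += y
    return total / n_cycles

print(running_average(0.2, 10))    # 0.2  (exact after 10 cycles)
print(running_average(0.3, 5))     # 0.3  (exact after 5 cycles)
print(running_average(0.25, 4))    # 0.25 (exact after 4 cycles)
print(running_average(0.275, 9))   # ~0.278, close but not exact
```

With this model, inputs that are "nice" ratios of the ±0.5 levels average out exactly after a few cycles, while an input like 0.275 never lands exactly and only converges as the averaging window grows.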
So, ideally, over how many cycles should the quantizer output be averaged?