Say you have a 10-bit ADC and your reference voltage is 5 V.
Then the minimum change in voltage that you can detect is 5 V/1024 ≈ 4.9 mV. So with a 5 V reference your resolution is about 4.9 mV (you can detect a change only if the input changes by at least 4.9 mV).
If you want to increase the resolution, you either need an ADC with more bits or a smaller reference voltage; the latter is easier.
Now, with the same ADC, if your reference voltage is 3 V,
then Res = 3 V/1024 ≈ 2.9 mV. So now you can detect a change of 2.9 mV, and hence you have higher resolution.
Therefore, resolution is the smallest change that you can detect.
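The arithmetic above can be sketched in a few lines of Python (a minimal sketch; `lsb_volts` is a hypothetical helper name, not a library function):

```python
# One LSB (least significant bit) of an ideal ADC is the reference
# voltage divided by the number of codes, 2^bits.
def lsb_volts(vref, bits):
    """Smallest detectable voltage change for an ideal ADC."""
    return vref / (2 ** bits)

print(lsb_volts(5.0, 10))  # 0.0048828125 V, i.e. ~4.9 mV with a 5 V reference
print(lsb_volts(3.0, 10))  # ~2.9 mV with a 3 V reference
```

Shrinking the reference makes each of the 1024 steps span a smaller voltage, which is why it improves resolution without changing the ADC.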
Hope that helps...
Resolution is the smallest change in the input signal that you can detect.
With higher resolution, you can detect smaller quantities of signal. For a 10-bit ADC with a reference of about 5 V, the lowest possible change in signal you can detect is 5 V/1024 = 0.0048828125 V; the ADC count increases for every ~4.9 mV. But if you use a 16-bit ADC, the lowest possible change you can detect is 5 V/65536 = 0.0000762939453125 V; here the ADC count increases for every ~76 µV.
If you need to detect very small input signals, go for an ADC with higher bit resolution. It purely depends on your application.
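The 10-bit versus 16-bit comparison above works out like this (a minimal sketch at a fixed 5 V reference; `lsb_volts` is a hypothetical helper name):

```python
# Same reference, different bit depths: more bits means more codes,
# so each code spans a smaller slice of the 5 V range.
def lsb_volts(vref, bits):
    """One LSB in volts: reference divided by the number of codes."""
    return vref / (2 ** bits)

print(lsb_volts(5.0, 10))  # 0.0048828125    -> count increments every ~4.9 mV
print(lsb_volts(5.0, 16))  # 7.62939453125e-05 -> count increments every ~76 uV
```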
But when we reduce the reference voltage for a high-bit ADC, not all the quantization levels get used, right? So could someone please explain how resolution increases? I was going through VCO-based ADCs, where it was said that as the voltage range decreases it becomes difficult to accurately quantize the signal, and resolution decreases. Sorry if I am wrong, but when I posted asking why it is difficult to accurately quantize as the voltage decreases, I got the following answer:
"For example, if you convert a 2.56 V signal with a linear 8-bit digital representation that handles a full range of 0 to 2.56 V, you have 10 mV per step.
If you apply a 0.256 V maximum signal to it, the ADC quantization can only use a small portion (say 25 or 26 steps) of the full 256-step range covering 0 to 2.56 V.
=> the quantization is 10 times less precise with a 0.256 V maximum signal than with a 2.56 V maximum signal.
And as SunnySkyguy says, the lower the signal amplitude, the more relative weight the stray noise has."
So for an 8-bit ADC, say 2.56 V/256 = 0.01 V, and for 0.256 V it is 0.256 V/256 = 0.001 V, so in the latter case a smaller change can be detected; but then why would the resolution be less? Please help.
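The quoted scenario can be sketched numerically (a minimal sketch assuming an ideal 8-bit ADC; note the reference stays fixed at 2.56 V, so the 10 mV step size does not shrink when the input does — that is the key difference from actually lowering the reference):

```python
# Ideal 8-bit ADC with a fixed 2.56 V full scale: the step is 10 mV
# regardless of how small the applied signal is.
def adc_code(v, vref=2.56, bits=8):
    """Map a voltage to the nearest ADC code, clamped to the code range."""
    step = vref / (2 ** bits)          # 10 mV per step here
    return min(round(v / step), 2 ** bits - 1)

print(adc_code(2.55))   # 255 -> a full-range signal exercises nearly all 256 codes
print(adc_code(0.256))  # 26  -> a 0.256 V signal only reaches ~26 of the 256 codes
```

With the reference unchanged, the small signal is still quantized in 10 mV steps, so only about 26 codes describe it; your 0.256 V/256 calculation instead assumes the reference itself is lowered to 0.256 V, which is when the step really becomes 1 mV.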