electronicsman
Full Member level 5
I want to know how accurately I am measuring voltage and current. Suppose I use a microcontroller and read an analog voltage with its ADC. What factors should I consider to determine how accurate the measurement is?
Say I apply 5 V at the input and measure 4.95 V. Can I claim I am reading with an accuracy of 0.05 V? A few doubts I have:

- How do I know I actually applied 5 V in the first place? There could be errors at the input end as well.
- Every time I read the ADC, the value changes slightly, e.g. 4.95, 4.96, 4.95. Should I take the worst-case value as the accuracy?
- Does the ADC resolution (10-bit, 12-bit, etc.) come into this?

Should I arrive at the accuracy purely from theoretical calculations, or from practical measurements, and should I do this over the entire input voltage range? Do I need to use any calibration tools? For context, my reading code is roughly like the sketch below. Please advise.
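This is a minimal sketch of what I mean, assuming a 10-bit ADC with a 5 V reference; adc_read() here is just a placeholder for the vendor's conversion routine, and the averaging count is arbitrary:

Code:
#include <stdint.h>

#define ADC_BITS   10                  /* assumed resolution */
#define ADC_COUNTS (1u << ADC_BITS)    /* 1024 codes for 10 bits */
#define VREF       5.0f                /* assumed reference voltage, volts */

/* Placeholder for the vendor-specific ADC read routine. */
extern uint16_t adc_read(void);

/* Average n samples to reduce random noise, then scale to volts.
   One LSB is VREF / 2^ADC_BITS = 5.0 / 1024, roughly 4.9 mV here,
   so readings can never be more precise than that step size. */
float adc_read_volts_avg(unsigned n)
{
    uint32_t sum = 0;
    for (unsigned i = 0; i < n; i++)
        sum += adc_read();
    float counts = (float)sum / (float)n;
    return counts * VREF / (float)ADC_COUNTS;
}

Even with this averaging, the 4.95 vs 4.96 flicker I described still shows up, which is why I am unsure whether the spec should come from theory or from measured spread.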