
Voltage and Current Measurement


electronicsman

I have a doubt: I want to know how accurately I am measuring voltage and current. Suppose I use a microcontroller and read an analog voltage with its ADC. What factors should I consider to know how accurately I am measuring?
Say I apply 5 V at the input and measure 4.95 V. Can I say I am reading with an accuracy of 0.05 V? A few doubts I have: how do I know that I really applied 5 V? There could be some errors at the input end as well. The other problem is that every time I read the ADC the value may change, e.g. 4.95, 4.96, 4.95 and so on. So should I take the worst value as the accuracy? It also comes to mind whether it is a 10-bit ADC, a 12-bit ADC, etc. So should the accuracy come from theoretical calculations or from practical measurements, and should I do this over the entire input voltage range? Do I need to use any calibration tools? Please advise.
 

Hi,

Say I apply 5 V at the input and measure 4.95 V. Can I say I am reading with an accuracy of 0.05 V?
There are three different things to distinguish:
* accuracy
* precision
* resolution

If you want high accuracy, then I recommend taking a lot of measurements and calculating the average.
This cancels out precision errors and increases resolution (although the extra resolution is often not of much benefit).
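As a minimal sketch of this averaging step in C (read_adc() here is a hypothetical function returning one raw 10-bit conversion from your ADC driver):

Code:
#include <stdint.h>

/* Hypothetical function returning one raw 10-bit ADC conversion. */
extern uint16_t read_adc(void);

/* Average 64 readings to reduce noise (precision error) and gain a little resolution. */
uint16_t read_adc_averaged(void)
{
    uint32_t sum = 0;
    for (uint8_t i = 0; i < 64; i++) {
        sum += read_adc();
    }
    return (uint16_t)((sum + 32) / 64);   /* +32 rounds to the nearest count */
}

Averaging 64 samples reduces random noise by roughly a factor of 8 (square root of 64); it does not remove offset or gain errors, which is why it helps precision rather than accuracy.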

* accuracy problems are mainly caused by drift (VRef drift, resistor drift, ADC gain drift, offset drift and linearity error, opamp offset...)
* precision problems are mainly caused by noise and quantisation error (with AC signals: additionally conversion clock jitter)

*******
On a real ADC this means:
* if VRef fluctuates by x%, then the digital result will also fluctuate by x%
--> thus using VCC as VRef is not recommended
* when, because of its temperature coefficient, a resistor value in a voltage divider drifts by x%, then the digital output value will also drift by x%
--> use high quality, low tolerance, low temperature coefficient resistors
* when the offset voltage fluctuates (ground bounce, Opamp offset voltage drift) by x millivolts, then the digital output will also drift by x mV
--> carefully design the PCB layout, use good, low drift Opamps
* x millivolts of signal input noise (resistor noise, Opamp noise, but also coupled noise - maybe from mains or an SMPS) will cause x mV of noise at the digital output
--> use low pass filters in the signal path, use filters in analog power supply and ADC power supply, use digital filters
* offset error and gain error (both non-fluctuating) can simply be calibrated out in software
--> subtract offset_value from the ADC reading and multiply the result by gain_correction to get more accurate results (a short sketch follows below)
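A minimal sketch of that correction step, using integer math only; the constant values here are made up for illustration, and in practice they come from measuring known reference inputs:

Code:
#include <stdint.h>

/* Calibration constants determined beforehand, e.g. by applying 0 V and a
 * known reference voltage and noting the raw readings (numbers are made up). */
static const int16_t  offset_value = 3;      /* raw counts read with 0 V applied      */
static const uint16_t gain_num     = 1012;   /* gain_correction = gain_num / gain_den */
static const uint16_t gain_den     = 1000;   /* i.e. about 1.012 here                 */

/* Apply offset and gain correction to one raw reading, integer math only. */
uint16_t correct_reading(uint16_t raw)
{
    int32_t corrected = ((int32_t)raw - offset_value) * gain_num / gain_den;
    if (corrected < 0) corrected = 0;        /* clamp if the offset pushes it negative */
    return (uint16_t)corrected;
}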

You may write a simple interrupt-driven ADC routine that continuously reads the ADC, does simple (integer) offset correction, gain correction and noise filtering.
Store the value in a "volatile, static" variable.
Then in the main loop you always have immediate (no waiting for ADC_conversion_complete) and simple access to clean and reliable ADC readings. Remember to use "atomic access".
Once this routine is written ... you may forget about it ... it won't influence your other software.
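A rough outline of such a routine; the device-specific helpers (adc_read_data_register(), adc_start_conversion(), the interrupt enable/disable calls) are placeholders that must be replaced with your microcontroller's own registers and intrinsics:

Code:
#include <stdint.h>

/* Device-specific placeholders: replace with your MCU's registers/intrinsics. */
extern uint16_t adc_read_data_register(void);   /* read the conversion result      */
extern void     adc_start_conversion(void);     /* trigger the next conversion     */
extern void     disable_interrupts(void);       /* e.g. cli() on an AVR            */
extern void     enable_interrupts(void);        /* e.g. sei() on an AVR            */
extern uint16_t correct_reading(uint16_t raw);  /* offset/gain fix, sketched above */

/* Latest filtered, corrected reading; written only by the ISR. */
static volatile uint16_t adc_result;

/* ADC conversion-complete interrupt: hook this to your device's ADC vector.
 * It runs in the background and always leaves a clean value in adc_result. */
void adc_isr(void)
{
    uint16_t corrected = correct_reading(adc_read_data_register());

    /* Simple first-order noise filter: result = 7/8 * old + 1/8 * new. */
    adc_result = (uint16_t)(((uint32_t)adc_result * 7u + corrected) / 8u);

    adc_start_conversion();
}

/* Main-loop access: copy the 16-bit value with interrupts briefly disabled,
 * so the read cannot be torn ("atomic access") on an 8-bit CPU. */
uint16_t get_adc_result(void)
{
    disable_interrupts();
    uint16_t copy = adc_result;
    enable_interrupts();
    return copy;
}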

Klaus
 

The main question is that my customer asks me what accuracy I can read the voltage with (for example, ±0.3 V). How do I answer him, and what tests do I need to do to get that value and show it as proof?
 

Hi,

What about measuring a decent voltage reference at a known temperature like 25 °C (or over a range of temperatures), or a similar test? Or measuring with 0 V and 0 A applied to see the error, if any, and then maybe measuring 1 V and 1 A? In some thread, sorry I don't remember which, Schmitt Trigger or WFeldman mentioned measuring something at 0 °C to calibrate or check accuracy.
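As a rough sketch of that two-point idea: a 12-bit ADC with a 5.000 V full-scale input is assumed, the example readings are made up, and the result feeds the offset/gain constants used in the correction sketch earlier in the thread:

Code:
#include <stdint.h>

/* Two-point check/calibration sketch. A 12-bit ADC with a 5.000 V full-scale
 * input is assumed, so the ideal slope is 4096 / 5.000 = 819.2 counts per volt.
 * raw_at_0V and raw_at_1V are the readings taken with 0 V and a 1.000 V
 * reference applied (example values: 4 and 822). */
void compute_calibration(int16_t raw_at_0V, int16_t raw_at_1V)
{
    int16_t offset_value    = raw_at_0V;                        /* offset error in counts */
    float   counts_per_volt = (float)(raw_at_1V - raw_at_0V);   /* measured slope         */
    float   gain_correction = 819.2f / counts_per_volt;         /* ideal / measured slope */

    /* Converted to the integer ratio used in the earlier correction sketch. */
    uint16_t gain_num = (uint16_t)(gain_correction * 1000.0f + 0.5f);
    uint16_t gain_den = 1000u;

    /* Store offset_value, gain_num and gain_den for use at run time. */
    (void)offset_value; (void)gain_num; (void)gain_den;
}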
 

Hi,

If you want to calculate accuracy you need to tell the whole story.
If only the microcontroller is involved in the measurement, then you will find the accuracy specification in its datasheet.

But I assume there is more...
Then please show the whole schematic, with complete information.

Klaus
 

