
Calibrate DC voltmeter during production...?


jit_singh_tara

Dear friends,

I designed a DC voltmeter using a Renesas microcontroller.

When I put 3 similar units under test, all 3 show different values.
Example: for a 24 V DC supply:

Meter1: 23.5 V DC
Meter2: 23.8 V DC
Meter3: 23.6 V DC

When I checked Vcc, it was slightly different in the 3 meters, as follows:

Meter1: 4.991 V
Meter2: 4.975 V
Meter3: 4.987 V

Since the supply voltage is used as the reference voltage for the ADC conversion, the difference is to be expected.
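For illustration (my own rough numbers, assuming the firmware treats Vref as exactly 5.000 V): the displayed value scales as Vref_assumed / Vcc_actual, so the Vcc spread from 4.975 V to 4.991 V (about 0.3%) by itself produces about a 0.3% spread in the readings, roughly 0.08 V at 24 V. The rest of the observed spread presumably comes from other tolerances, e.g. the input divider resistors.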
How can I make the readings accurate from a production point of view? I don't want to use a trimpot.
How should I calibrate during production?
 

Instead of using the supply voltage of the controller, generating a separate, stable reference is one option.
 
Hi,

"Since the supply voltage is used as the reference voltage for the ADC conversion, the difference is to be expected."

You could also use the mains voltage as a voltage reference or an RC oscillator as a frequency reference, but neither is good design.

There are dedicated voltage reference ICs. They guarantee a precise, low-noise voltage: over time, over temperature, over load, over mechanical stress...

A voltage regulator, however, is not made for this kind of precision.
You can certainly calibrate out the error to get a precise voltage reading, but there is no guarantee that it will show the same result tomorrow.
**********

You need to clarify what you want to calibrate:
* you could just do a gain calibration at 24 V (see the example after this list)
* or offset, then gain
* or additionally linearity
* or additionally temperature compensation.
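
For the simplest case, a single-point gain calibration against a known 24 V source (using the numbers from the first post for illustration): gain = V_known / V_displayed = 24.00 / 23.5 ≈ 1.021 for Meter1. Store that one factor during production and multiply every subsequent reading by it.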


With a voltmeter, a useful calibration (offset and gain) could look like this:
* switch the device on and wait at least one minute
* take two measurements: one at a known low voltage (maybe 5% FS) and one at a known high voltage (maybe 95% FS)
* from these, calculate the offset and gain calibration values.

val_calibrated = (val_uncalib - offset) * gain

...is simple math for any microcontroller.
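
A minimal sketch of this two-point calibration in C (assumptions on my part, not from this thread: a 10-bit ADC delivering raw counts, float math available, and somewhere to persist the two constants; all names are illustrative, not from any particular Renesas library):

#include <stdint.h>

typedef struct {
    float offset;   /* raw ADC counts corresponding to 0 V (derived) */
    float gain;     /* volts per ADC count */
} cal_t;

/* Production step: apply two known voltages (e.g. 5% and 95% of full
 * scale), record the raw ADC counts, and derive the two constants. */
cal_t calibrate(float v_low, uint16_t raw_low,
                float v_high, uint16_t raw_high)
{
    cal_t c;
    c.gain   = (v_high - v_low) / (float)(raw_high - raw_low);
    c.offset = (float)raw_low - v_low / c.gain;
    return c;
}

/* Runtime: val_calibrated = (val_uncalib - offset) * gain */
float to_volts(const cal_t *c, uint16_t raw)
{
    return ((float)raw - c->offset) * c->gain;
}

During production the two constants would be written once to EEPROM or flash; at power-up the firmware loads them and runs every raw reading through to_volts(). This removes the unit-to-unit Vcc and divider tolerances, as long as Vcc stays where it was at calibration time.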

Klaus
 
