
Accuracy of a measurement device.


Palpurul

Member level 3
Joined
May 31, 2016
Messages
66
Helped
9
Reputation
18
Reaction score
8
Trophy points
8
Location
Turkey
Activity points
593
I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy. What do you do? Or consider this scenario: you manufactured an RF signal generator and want to measure its amplitude accuracy, and cross-checking with another similar product is not an option. What do you do? I am not concerned with these two cases specifically (if you know how these two work, don't hesitate to share); I just want to learn how the accuracy of anything is characterized. When you cross-check with something more accurate, you surely need to cross-check that more accurate device against yet another, even more accurate device.
 

I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy. What do you do? ..

This is a very valid and interesting question.
Although I don't know the exact methods that were used to determine, let's say, one ampere, I believe that they go back to basic theoretical physics, and then to the design of an ingenious method to demonstrate the theory.

For instance, take the speed of light. Astronomers for centuries had a very good idea that light travels very fast, based on astronomical observations, but the actual accepted value wasn't very accurate.
It wasn't until Léon Foucault's famous experiment, later refined by Albert Michelson, that a reasonably accurate value was found, in the mid-1800s. Using modern equipment and instrumentation, this value has been further refined.

Although devising an experiment to define and determine the value of one ampere may not be as glamorous as determining the speed of light, I'm pretty sure that a similar method was followed.
 

In general, measurement accuracy depends on sampling rate for the measured value and on the error percentages of the devices and components: when a multimeter can acquire and process more samples per second, it gives you more accuracy.
For example, multimeters depend on an ADC. When this ADC has 10-bit resolution, it can measure voltage in steps of 4.88 mV (assuming a 5 V full scale); when its resolution is 24 bits, it can measure steps of about 0.0003 mV.
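For reference, the step size described here is just the full-scale range divided by 2^N. A minimal Python sketch, assuming the 5 V full scale that the 4.88 mV figure implies:

```python
# LSB (smallest voltage step) of an ideal ADC: full-scale range / 2^bits.
# The 5 V full scale is an assumption implied by the 4.88 mV figure above.
def adc_lsb(full_scale_v: float, bits: int) -> float:
    return full_scale_v / (2 ** bits)

for bits in (10, 16, 24):
    print(f"{bits}-bit ADC, 5 V full scale: LSB = {adc_lsb(5.0, bits) * 1000:.6f} mV")
# 10 bits -> ~4.88 mV, 24 bits -> ~0.0003 mV, matching the figures above.
```

Note that, as the reply below points out, this step size is the resolution, not the accuracy.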

regards,
 

I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy. What do you do? ..

Obviously you need test instruments that are far more accurate than the device you are testing. We usually insist on a test instrument that is at least 10 times more sensitive and accurate than the device you want to test.

If you have made the first multimeter, and you want to test its accuracy, you have several options:

1. Use standard cells (if you have access to them) to test your multimeter at several discrete points (see the sketch after this list).

2. Use lab setups (may be clumsy) to test the accuracy.

3. Use theoretical analysis to estimate errors.
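As a rough illustration of option 1, here is a small Python sketch that compares meter readings against known reference voltages at a few discrete points; the reference points and readings below are illustrative placeholders, not data from any real calibration:

```python
# Hypothetical spot-check of a new multimeter against reference voltages
# (e.g. standard cells). All numbers below are illustrative placeholders.
reference_points_v = [1.018, 5.000, 10.000]   # a Weston standard cell is ~1.018 V
meter_readings_v   = [1.021, 5.012, 10.019]   # what the meter under test displays

for ref, meas in zip(reference_points_v, meter_readings_v):
    error_pct = 100.0 * (meas - ref) / ref
    print(f"reference {ref:6.3f} V -> reading {meas:6.3f} V, error {error_pct:+.2f} %")
```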
 

Hi,

in general, measurement accuracy depends on sampling rate
I'm sorry, but this is wrong.
There are:
* accuracy
* precision
* resolution
Resolution is the smallest step size you can decode.
Precision is "repeatability". It tells how much the output value fluctuates with a constant input signal.
Accuracy is how much the measured value differs from the true value.

You may sample at a rate of 1 per hour and get better accuracy than sampling 1,000,000 times per second.

What you mean is that if you average a number of samples then the precision is better than using just a single sample.

Accuracy and precision are almost independent. You can have good precision and bad accuracy:
Example: true value = 5.000V. Measured values: 4.723V, 4.724V, 4.724V, 4.725V
--> precision: +/- 0.001V, accuracy 0.9448, which means 5.52% error.

Or you can have good accuracy with bad precision:
Example: true value = 5.000V, measured values: 5.087V, 4.945V, 4.996V, 5.028V
--> precision: about +/- 0.071V, accuracy 1.0028, which means 0.28% error.

Resolution has almost nothing to do with accuracy and/or precision.
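To make the distinction concrete, here is a short Python sketch that works through the two example data sets above, treating precision as the spread of repeated readings and accuracy as how far their mean is from the true value:

```python
# Precision vs. accuracy, using the sample values from the examples above.
true_value = 5.000

def summarize(label, readings):
    mean = sum(readings) / len(readings)
    spread = (max(readings) - min(readings)) / 2   # +/- half the range
    error_pct = 100.0 * abs(true_value - mean) / true_value
    print(f"{label}: mean {mean:.3f} V, precision +/- {spread:.3f} V, error {error_pct:.2f} %")

summarize("good precision, bad accuracy", [4.723, 4.724, 4.724, 4.725])
summarize("good accuracy, bad precision", [5.087, 4.945, 4.996, 5.028])
# first set: tight spread (+/- 0.001 V) but about 5.5 % off the true value
# second set: mean ~5.014 V (about 0.3 % error) but a spread of about +/- 0.07 V
```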

*****
About the original question:
A good source of information: https://en.m.wikipedia.org/wiki/Ampere
Usually every country has its "bureau of standards". They have exceptionally accurate tools to measure or to provide defined physical values. High-quality measurement devices can be calibrated against their standards. Usually they need to be recalibrated after a given time to ensure accuracy.


Klaus
 

This is a very valid and interesting question. ..

I liked your speed of light measurement example. I think for every measurement there is some kind of specific and sophisticated experiment associated with it.
 

I liked your speed of light measurement example. I think for every measurement there is some kind of specific and sophisticated experiment associated with it.

In reality, the matter is more complex than that. The speed of light is a constant by definition today, and the standard meter is defined in terms of the speed of light.

The speed of light involves time; the second is also defined in terms of another physical constant (by definition): the hyperfine transition frequency of the Cs-133 isotope.

If you look at the table made by NIST (**broken link removed**) it is mentioned that the speed of light is exactly known.

Most reference clocks use either Cs or Rb.

Derived units are different. The ampere will be defined in terms of the coulomb and the second. But I do not know what standards are commonly available for the coulomb.

Similar considerations apply for other derived units (like volt).
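As a side note on how this settled: the units mentioned here are tied back to fixed numerical constants. A small Python sketch with the exact defining values (the elementary-charge definition of the ampere was only fixed in the 2019 SI revision, after this discussion):

```python
# Exact SI defining constants (fixed by definition, not measured).
c        = 299_792_458        # speed of light in vacuum, m/s -> defines the meter
delta_nu = 9_192_631_770      # Cs-133 hyperfine frequency, Hz -> defines the second
e        = 1.602_176_634e-19  # elementary charge, C -> defines the ampere (since 2019)

print("1 s =", delta_nu, "periods of the Cs-133 hyperfine radiation")
print("1 m = distance light travels in 1/%d s" % c)
print("1 A = 1 C/s =", 1 / e, "elementary charges per second")
```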
 
