  1. #1
    Palpurul (Junior Member level 3)

    Accuracy of a measurement device.

    I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy: what do you do? Or consider this scenario: you manufactured an RF signal generator and want to measure its amplitude accuracy, but cross-checking against another similar product is not an option. What do you do? I am not concerned with these two cases specifically (if you know how these two work, don't hesitate to share); I just want to learn how the accuracy of anything is characterized. When you cross-check against something more accurate, surely you then need to cross-check that more accurate device against something even more accurate.

  2. #2
    schmitt trigger (Advanced Member level 5)

    Re: Accuracy of a measurement device.

    Quote Originally Posted by Palpurul View Post
    I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy: what do you do? ...
    This is a very valid and interesting question.
    Although I don't know the exact methods that were used to determine, let's say, one ampere, I believe they go back to basic theoretical physics, followed by the design of an ingenious experiment to demonstrate the theory.

    For instance, take the speed of light. For centuries astronomers had a very good idea that light travels very fast, based on astronomical observations, but the accepted value wasn't very accurate.
    It wasn't until Léon Foucault's famous experiment, later refined by Albert Michelson, that a reasonably accurate value was obtained in the mid-1800s. Using modern equipment and instrumentation, this value has been refined further.

    Although devising an experiment to define and determine the value of one ampere may not be as glamorous as determining the speed of light, I'm pretty sure a similar method was followed.

  3. #3
    Junior Member level 3

    Re: Accuracy of a measurement device.

    In general, measurement accuracy depends on the sampling rate for the measured value and on the error percentages of the devices and components: when a multimeter can acquire and process more samples per second, it gives you more accuracy.
    For example, multimeters rely on an ADC. When this ADC has 10-bit resolution it can resolve voltage steps of about 4.88 mV; when its resolution is 24 bits it can resolve steps of about 0.0003 mV (roughly 0.3 µV). Both figures assume a 5 V full scale.

    regards,
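    A quick numeric check of the step sizes quoted above, as a minimal sketch in Python. The 5 V full scale is an assumption inferred from the 4.88 mV figure; the post itself does not state it.

    # Ideal ADC step size (LSB) for a given resolution.
    # Assumption: 5.0 V full scale, inferred from the 4.88 mV figure above.
    def lsb_volts(full_scale_v: float, bits: int) -> float:
        """Voltage represented by one code of an ideal ADC."""
        return full_scale_v / (2 ** bits)

    for bits in (10, 24):
        print(f"{bits}-bit, 5 V full scale: LSB = {lsb_volts(5.0, bits) * 1e3:.6f} mV")
    # Prints roughly 4.882813 mV for 10 bits and 0.000298 mV (about 0.3 uV) for 24 bits.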


  4. #4
    Advanced Member level 5

    Re: Accuracy of a measurement device.

    Quote Originally Posted by Palpurul View Post
    I can't get my head around this. Let's say you produced the world's first multimeter and want to determine its accuracy: what do you do? ...
    Obviously you need test instruments that are far more accurate than the device you are testing. We usually insist on a test instrument that is at least 10 times more sensitive and accurate than the device under test.

    If you have made the first multimeter and want to test its accuracy, you have several options:

    1. Use standard cells (if you have access to them) to test your multimeter at several discrete points.

    2. Use lab setups (which may be clumsy) to test the accuracy.

    3. Use theoretical analysis to estimate the errors.
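    A minimal sketch in Python of the 10:1 rule mentioned above, often called the test accuracy ratio (TAR). The tolerance and uncertainty numbers below are invented illustration values, not specifications of any real instrument.

    # Test accuracy ratio: the reference should be roughly 10x tighter
    # than the device under test (DUT). All numbers are hypothetical.
    def test_accuracy_ratio(dut_tolerance: float, reference_uncertainty: float) -> float:
        """Ratio of the DUT's tolerance to the reference's uncertainty."""
        return dut_tolerance / reference_uncertainty

    dut_tol = 0.05   # hypothetical multimeter spec: +/-0.05 V at 5 V
    ref_unc = 0.004  # hypothetical reference uncertainty: +/-0.004 V
    tar = test_accuracy_ratio(dut_tol, ref_unc)
    print(f"TAR = {tar:.1f}:1 -> " + ("meets 10:1" if tar >= 10 else "reference not accurate enough"))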

  5. #5
    Klaus (Super Moderator)

    Re: Accuracy of a measurement device.

    Hi,

    in general, measurement accuracy depends on sampling rate
    I'm sorry, but this is wrong.
    There is
    * accuracy
    * precision
    * resolution
    Resolution is the smallest step size you can resolve.
    Precision is repeatability: it tells you how much the output value fluctuates for a constant input signal.
    Accuracy is how much the measured value differs from the true value.

    You may sample at a rate of 1 per hour and get better accuracy than sampling 1,000,000 times per second.

    What you mean is that if you average a number of samples, the precision is better than using just a single sample.

    Accuracy and precision are largely independent. You can have good precision and bad accuracy:
    Example: true value = 5.000 V. Measured values: 4.723 V, 4.724 V, 4.724 V, 4.725 V
    --> precision: +/- 0.001 V; accuracy: the mean reading is 4.724 V, about 0.276 V low, which means a 5.52% error.

    Or you can have good accuracy with bad precision:
    Example: true value = 5.000V, measured values: 5.087V, 4.945V, 4.996V, 5.028V

    Resolution, by itself, has almost nothing to do with accuracy or precision.
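    As a minimal sketch in Python, the two sets of readings above can be reduced to the two separate figures described here: the mean offset from the true value (accuracy) and the scatter of the readings (precision).

    # Accuracy vs. precision for the example readings quoted above.
    from statistics import mean, pstdev

    true_value = 5.000
    reading_sets = {
        "good precision, bad accuracy": [4.723, 4.724, 4.724, 4.725],
        "good accuracy, bad precision": [5.087, 4.945, 4.996, 5.028],
    }
    for name, readings in reading_sets.items():
        offset = mean(readings) - true_value  # systematic error -> accuracy
        spread = pstdev(readings)             # random scatter   -> precision
        print(f"{name}: mean error = {offset:+.3f} V "
              f"({abs(offset) / true_value * 100:.2f} %), spread = {spread:.3f} V")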

    *****
    About the original question:
    A good source of information: https://en.m.wikipedia.org/wiki/Ampere
    Usually every country has its own bureau of standards, with exceptionally accurate instruments to measure, or to generate, defined physical quantities. High-quality measurement devices can be calibrated against those standards, and they usually need to be recalibrated after a given interval to ensure continued accuracy.


    Klaus


  6. #6
    Palpurul (Junior Member level 3)

    Re: Accuracy of a measurement device.

    Quote Originally Posted by schmitt trigger View Post
    This is a very valid and interesting question. ...
    I liked your speed of light measurement example. I think for every measurement there is some kind of specific and sophisticated experiment associated with it.

  7. #7
    Advanced Member level 5

    Re: Accuracy of a measurement device.

    Quote Originally Posted by Palpurul View Post
    I liked your speed of light measurement example. I think for every measurement there is some kind of specific and sophisticated experiment associated with it.
    In reality, the matter is more complex than that. The speed of light is a constant by definition today, and the standard meter is defined in terms of the speed of light.

    The speed of light involves time; the second is likewise defined via another physical constant (by definition), the hyperfine transition frequency of a caesium isotope.

    If you look at the table published by NIST (https://www.nist.gov/sites/default/f...c/wall2014.pdf), it states that the speed of light is known exactly.

    Most reference clocks use either Cs or Rb.

    Derived constants are different. The ampere will be defined in terms of the coulomb and the second, but I do not know what standards are commonly available for the coulomb.

    Similar considerations apply to other derived units (like the volt).
