Hi,
in general, measurement accuracy depends on sampling rate
I'm sorry, but this is wrong.
There is
* accuracy
* precision
* resolution
Resolution is the smallest step size you can decode.
Precision is "repeatability". It tells how much the output value fluctuates with a constant input signal.
Accuracy is how much the measured value differs from the true value.
You may sample at a rate of 1 per hour and get better accuracy than sampling 1,000,000 times per second.
What you mean is that if you average a number of samples, the precision is better than with just a single sample.
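That averaging effect can be sketched numerically (a hypothetical Python simulation, not from the original post; the voltage, noise level, and sample counts are made-up numbers): the random scatter of an N-sample average shrinks by roughly a factor of sqrt(N) compared to single raw samples.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable

TRUE_VALUE = 5.000  # volts (assumed constant input signal)
NOISE_SD = 0.050    # assumed random noise per raw sample, in volts
N_AVG = 100         # raw samples averaged per reading

# Scatter of single raw samples
singles = [TRUE_VALUE + random.gauss(0, NOISE_SD) for _ in range(2000)]

# Scatter of readings that each average N_AVG raw samples
means = [
    statistics.mean(TRUE_VALUE + random.gauss(0, NOISE_SD) for _ in range(N_AVG))
    for _ in range(2000)
]

print(f"single-sample scatter  : {statistics.stdev(singles):.4f} V")
print(f"100-sample-avg scatter : {statistics.stdev(means):.4f} V")
# The averaged readings scatter roughly sqrt(100) = 10x less -- better
# precision. Note the average is still centered on whatever the ADC
# reports, so a systematic offset (bad accuracy) is NOT removed.
```

Note that averaging only reduces random scatter; a constant offset error passes through the average untouched, which is exactly why precision and accuracy are separate concepts.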
Accuracy and precision are almost independent. You can have good precision and bad accuracy:
Example: true value = 5.000V. Measured values: 4.723V, 4.724V, 4.724V, 4.725V
--> precision: +/- 0.001V; but the mean (4.724V) is 0.276V below the true 5.000V, which means 5.52% error.
Or you can have good accuracy with bad precision:
Example: true value = 5.000V, measured values: 5.087V, 4.945V, 4.996V, 5.028V
--> the mean (5.014V) is only 0.28% off, but the individual readings scatter by about +/- 0.071V.
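The numbers in both examples can be checked with a few lines of Python (a sketch using only the four readings listed above; "precision" is taken here as half the observed spread, and "accuracy" as the relative error of the mean):

```python
import statistics

TRUE_VALUE = 5.000  # volts

good_precision = [4.723, 4.724, 4.724, 4.725]  # tight readings, but offset
good_accuracy  = [5.087, 4.945, 4.996, 5.028]  # centered readings, but scattered

def report(name, readings):
    mean = statistics.mean(readings)
    spread = (max(readings) - min(readings)) / 2          # half the observed spread
    error_pct = abs(mean - TRUE_VALUE) / TRUE_VALUE * 100 # relative error of mean
    print(f"{name}: mean={mean:.3f}V, "
          f"precision=+/-{spread:.3f}V, error={error_pct:.2f}%")

report("good precision, bad accuracy", good_precision)
# mean=4.724V, precision=+/-0.001V, error=5.52%
report("good accuracy, bad precision", good_accuracy)
# mean=5.014V, precision=+/-0.071V, error=0.28%
```

The first data set repeats almost perfectly but sits far from the true value; the second scatters widely but averages close to it.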
Resolution has little to nothing to do with accuracy and/or precision.
*****
About the original question:
A good source of information:
https://en.m.wikipedia.org/wiki/Ampere
Usually every country has its "bureau of standards". They typically have exceptionally accurate tools to measure, or to generate, defined physical values. High-quality measurement devices can be calibrated against their standards. Usually they need to be recalibrated after a given time to ensure accuracy.
Klaus