
What's the difference between Accuracy and Precision?


ZengLei

An instrument for measurement has two parameters:

Accuracy & Precision

What's the difference between them?
 

I'm not quite sure, but I guess "accuracy" is more related to the reading and "precision" to the ability of the device to measure the parameter.
 

If a result is accurate, it is close to the actual value, even if there was some error in taking the readings.

If a result is precise, all readings are close to each other, i.e. there is little random error between readings.
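
As a rough sketch of the distinction (Python, with made-up readings and a made-up true value, neither from any real instrument), accuracy shows up as the offset of the mean from the true value, while precision shows up as the scatter between readings:

```python
# Minimal sketch with hypothetical readings: accuracy = closeness of the
# mean to the true value, precision = scatter between the readings.
import statistics

TRUE_VALUE = 10.0                    # the actual quantity being measured
readings = [10.4, 10.5, 10.4, 10.6]  # made-up instrument readings

mean = statistics.mean(readings)
bias = mean - TRUE_VALUE             # accuracy error (systematic offset)
spread = statistics.stdev(readings)  # precision (random scatter)

print(f"mean = {mean:.2f}, bias = {bias:+.2f} (accuracy)")
print(f"stdev = {spread:.2f} (precision)")
```

With these numbers the readings are precise (standard deviation about 0.10) but not accurate (the mean sits about +0.48 above the true value).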
 

Hi,
In manufacturing, accuracy describes how correct the products are (how close they come to the target value), while precision, when measuring a fixed thing, describes the amount of scatter in the measurements or samples.
Regards
 

Accuracy: the limit of error of an instrument. This is normally given as a number called the accuracy class (or index), e.g. 0.05, 0.1, 1.0, 1.5, etc. The figure is given on the face of the instrument. An instrument with a class index of 1.0 will measure with an error of up to 1% of the full-scale reading.

Precision: the ability of an instrument to reproduce the same reading when the same value of the measured quantity is measured several times.

An instrument can be precise but not accurate.
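
As a small illustrative sketch (Python; the function name and the 0-100 V voltmeter range are my own assumptions, not from the thread), the class index translates into an absolute error limit as a percentage of full scale:

```python
# Hypothetical sketch: the error limit implied by an accuracy class
# index, taken as a percentage of the full-scale value.
def error_limit(full_scale, class_index):
    """Maximum absolute error guaranteed by the accuracy class."""
    return full_scale * class_index / 100.0

# e.g. a 0-100 V voltmeter of class 1.0 may read up to 1 V off
# anywhere on its scale; a class 0.05 instrument only 0.05 V.
print(error_limit(100.0, 1.0))   # -> 1.0
print(error_limit(100.0, 0.05))  # -> 0.05
```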

 

If you measure a variable n times and get identical results, it is called accuracy, but if the results are the same as reality, you have precision.
 

mdanaie said:
If you measure a variable n times and get identical results, it is called accuracy, but if the results are the same as reality, you have precision.

It is the other way around.

Accuracy describes how small an error the instrument will have relative to the real value.
Precision describes how tightly grouped the measurement results are around some center value.

For example, a good instrument, when not calibrated, can give measurement results that are inaccurate, e.g. off by -5%, but all measurements will be tightly grouped around that inaccurate value, e.g. within ±0.05%. Once calibrated, the instrument will give accurate measurements within ±0.05% error.
In contrast, a bad instrument with a precision of ±5% will have ±5% error even when it is calibrated.
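
To make that concrete, here is a rough simulation (Python; the numbers and the helper function are illustrative assumptions, not from any real instrument) of the two cases above: a precise instrument with a -5% bias that calibration fixes, and an imprecise ±5% instrument that calibration cannot help:

```python
# Rough simulation: calibration removes a systematic bias (accuracy),
# but cannot reduce random scatter (precision).
import random

TRUE_VALUE = 100.0

def measure(bias_pct, noise_pct):
    """One simulated reading: systematic bias plus random noise (both in %)."""
    noise = random.uniform(-noise_pct, noise_pct)
    return TRUE_VALUE * (1 + (bias_pct + noise) / 100.0)

# Precise but uncalibrated: -5% bias, only +/-0.05% scatter.
raw = [measure(-5.0, 0.05) for _ in range(5)]
# Calibration: divide out the known -5% offset.
calibrated = [x / 0.95 for x in raw]
# Imprecise instrument: no bias, but +/-5% scatter; calibration can't help.
imprecise = [measure(0.0, 5.0) for _ in range(5)]

print("uncalibrated:", [round(x, 2) for x in raw])         # ~95, tightly grouped
print("calibrated:  ", [round(x, 2) for x in calibrated])  # ~100, tightly grouped
print("imprecise:   ", [round(x, 2) for x in imprecise])   # scattered ~95..105
```

Running it shows the first list tightly grouped near 95, the calibrated list tightly grouped near 100, and the last list scattered anywhere between roughly 95 and 105.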

So in layman's terms, you could think of precision as a measure of how good the instrument is, while accuracy is just a matter of calibration. Sort of like the old joke that a broken analog clock is the most accurate clock: it is exactly right twice a day, while a working clock that is absolutely precise but runs slightly off is never as exactly right as the broken one at those two moments. :)
 
