Accuracy vs Precision

Scientists often measure and predict things, so we need ways to describe how much we know, how close a number is to reality, and how likely we are to get the same number again. The terms accuracy and precision are generally used to describe these things, but there can be some ambiguity. This post explains the difference between the two and explores precision’s multiple meanings.

[Figure: accuracy vs precision targets] These targets illustrate different combinations of accuracy and precision. Accuracy describes the proximity to the desired answer (the bull’s eye). Precision describes the points’ proximity to each other; less scatter means higher precision.

Accuracy refers to the correctness of a measurement or prediction. The results can vary a lot, but what matters is the difference between the measurements or predictions and what is considered the real or accepted value. Precision is often contrasted with accuracy by emphasizing the repeatability sense of the word. This is most applicable to measurements, but it can apply to modelling too (e.g., stochastic models). When measuring something, we want some confidence that if we were to measure the same thing again, we would get a similar answer. In this situation, precision describes the degree of that similarity.
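
As a rough illustration of the distinction (the measurement values and accepted value below are made up for the example), accuracy can be summarized by how far the measurements sit from the accepted value on average, while precision in the repeatability sense can be summarized by how tightly the measurements cluster:

```python
import statistics

# Hypothetical repeated measurements of a quantity whose accepted value is 10.0
true_value = 10.0
measurements = [10.2, 9.9, 10.1, 10.0, 10.3]

# Accuracy: how close the measurements are, on average, to the accepted value
mean_error = statistics.mean(m - true_value for m in measurements)

# Precision (repeatability): how tightly the measurements cluster around their own mean
spread = statistics.stdev(measurements)

print(f"mean error (accuracy): {mean_error:+.2f}")
print(f"standard deviation (precision): {spread:.2f}")
```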

Accuracy and precision are both highly desirable characteristics for our measurements and predictions, but they are largely independent. For example, most models produce the same results when run with the same algorithms and inputs, but that has nothing to do with the accuracy of the model’s predictions. Measurements can also be precise without being accurate. When this happens, the difference between the measurements and the true value is called the bias (i.e., the bias of an estimator). If we can quantify the bias, it is possible to adjust for it and improve the accuracy.
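
A minimal sketch of that idea, with invented numbers: the readings below are precise (tightly clustered) but inaccurate, and subtracting the estimated bias pulls them back toward the true value.

```python
import statistics

true_value = 10.0
# Precise but inaccurate: the readings cluster tightly, well away from the true value
measurements = [11.4, 11.5, 11.6, 11.5, 11.4]

bias = statistics.mean(measurements) - true_value    # systematic offset
corrected = [m - bias for m in measurements]         # adjust for the estimated bias

print(f"estimated bias: {bias:+.2f}")
print(f"corrected mean: {statistics.mean(corrected):.2f}")  # now close to 10.0
```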

Another meaning of precision creates some ambiguity when using the term. This second meaning describes how much we know about a measurement. For example, the rules of significant digits help us report numbers that don’t express more than what we really know about a measurement. This definition of precision is mostly used to describe the exactness of measurements, but it is sometimes used to describe the level of detail in spatial applications. For example, the popular term ‘precision agriculture’ emphasizes the extra level of detail at which the work can be done.
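
A small sketch of this significant-digits sense of precision (the raw value is arbitrary, and the rounding is done with Python’s general format specifier rather than a formal sig-figs library):

```python
# Report a value with only as many significant digits as the measurement supports.
raw_value = 3.14159265

for sig_digits in (2, 3, 5):
    print(f"{sig_digits} significant digits: {raw_value:.{sig_digits}g}")
# 2 significant digits: 3.1
# 3 significant digits: 3.14
# 5 significant digits: 3.1416
```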

Confusion between precision and accuracy is fueled by the common mixing of the terms. Merriam-Webster defines precision as “the accuracy (as in binary or decimal places) with which a number can be represented.” Although using ‘accuracy’ to define a meaning of precision is really a misuse of the term, they are trying to get at the concept of exactness. Ambiguity exists in the scientific community too: the ISO has advocated defining accuracy to encompass both trueness and precision.

To avoid ambiguity, using alternative terms for precision may be a wise choice. When describing the ability to get similar results from multiple measurements, the term ‘repeatability’ would be clearer. Precision, in the sense of quantitative or spatial detail, isn’t as easily replaced by a term such as ‘exactness’, because even that can conjure ideas of accuracy. As semantics evolve, maybe the meaning of precision will narrow. In the meantime, when possible, we can use ‘detail’ and ‘resolution’ for spatial applications and ‘significant digits’ for quantitative applications.
