What Are Accuracy and Precision?
Accuracy and Precision: Measurement is essential for understanding the external world, and over a long history humans have developed a sense of measurement. Measurements require tools that provide scientists with a quantity. The problem is that the result of every measurement, by any measuring instrument, contains some uncertainty. This uncertainty is referred to as error.
Accuracy and precision are two important factors to consider when taking measurements. Both terms reflect how close a measurement is to a known or accepted value. In this article, let us learn about accuracy and precision in detail.
Definition of Accuracy
The ability of an instrument to measure a value close to the true value is known as accuracy; in other words, accuracy is the closeness of the measured value to a standard or true value.
Alternatively, ISO defines accuracy as a combination of both types of observational error (random and systematic), so high accuracy requires both high precision and high trueness.
Other Definitions of Accuracy
- Accuracy is the degree of agreement between the experimental result and the true value.
- Accuracy can be improved by taking small readings, since a small reading reduces the error of the calculation.
The formula for accuracy:
Accuracy = (Number of correct predictions) / (Total number of predictions)
The formula for binary classification:
For binary classification, accuracy can also be calculated in terms of positives and negatives as follows:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
Where TP = True Positives, TN = True Negatives, FP = False Positives, and FN = False Negatives.
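As a minimal sketch, the binary-classification formula can be computed directly from the four counts (the counts below are hypothetical, chosen only for illustration):

```python
def accuracy(tp, tn, fp, fn):
    """Accuracy = (TP + TN) / (TP + TN + FP + FN): the fraction of all
    predictions that were correct."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts: 40 TP, 45 TN, 5 FP, 10 FN -> 85 correct out of 100
print(accuracy(40, 45, 5, 10))  # 0.85
```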
Classifications of Accuracy
The accuracy of the system is classified into three types as follows:
1. Point Accuracy
- The accuracy of the instrument only at a particular point on its scale is known as point accuracy.
- It is important to note that this accuracy does not give any information about the general accuracy of the instrument.
2. Accuracy as Percentage of Scale Range
Here accuracy is expressed as a percentage of the instrument's full scale range. This can be better understood with the help of the following example:
Consider a thermometer with a scale range up to 500 ºC and an accuracy of ±0.5 percent of the scale range. An error of up to ±2.5 ºC (0.5 percent of 500 ºC) is considered negligible; a reading that deviates by more than this is treated as a significant error.
3. Accuracy as Percentage of True Value
- This type of accuracy is determined by comparing the measured value with the true value.
- Errors of up to ±0.5 percent of the true value are neglected.
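The two percentage-based notions of accuracy above can be contrasted in a short sketch (the thermometer numbers are illustrative, not from a real instrument):

```python
def error_pct_of_scale(measured, true_value, scale_range):
    """Error expressed as a percentage of the instrument's full scale range."""
    return abs(measured - true_value) / scale_range * 100

def error_pct_of_true(measured, true_value):
    """Error expressed as a percentage of the true value itself."""
    return abs(measured - true_value) / true_value * 100

# A thermometer with a 500 degC range reading 252.5 degC when the true value is 250 degC:
print(error_pct_of_scale(252.5, 250, 500))  # 0.5 -> exactly at a +/-0.5% of-scale spec
print(error_pct_of_true(252.5, 250))        # 1.0 -> 1% of the true value
```

The same absolute error of 2.5 ºC looks different under the two conventions, which is why the classification above distinguishes them.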
Definition of Precision
Precision is the degree of agreement among a series of measurements of the same quantity. It is a measure of the reproducibility of results rather than their correctness.
Other Definitions of Precision
- The closeness of two or more measurements to each other is known as precision.
- If you weigh a given substance five times and get 3.2 kg each time, then your measurement is very precise but not necessarily accurate. Precision is independent of accuracy.
- The value of precision differs because of the observational error.
Formula for precision:
Precision = TP / (TP + FP)
Where TP = True Positives, FP = False Positives.
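Mirroring the accuracy sketch above, precision can be computed from the two counts (again with hypothetical numbers):

```python
def precision(tp, fp):
    """Precision = TP / (TP + FP): the fraction of positive predictions
    that were actually positive."""
    return tp / (tp + fp)

# Hypothetical counts: 40 true positives, 5 false positives
print(round(precision(40, 5), 3))  # 0.889
```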
Advantage of Precision
- Precision is used to assess the consistency or reproducibility of a measurement.
Characteristic of Precision
- Conformity and the number of significant figures are the characteristics of precision.
High Precision and Low Precision
- High precision means the results of the measurements are consistent: repeated readings give the same value.
- Low precision means the measured value varies from reading to reading.
- However, a highly precise reading does not necessarily give an accurate result.
Example of Precision
Consider 100 V, 101 V, 102 V, 103 V, and 105 V as different readings of a voltage taken by a voltmeter. The readings are close to one another; they are not exactly the same because of error. But since the readings are close to each other, we say that they are precise.
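One illustrative way to quantify how close these readings are to one another is to compute their spread; this sketch uses the sample standard deviation as a stand-in measure of precision:

```python
import statistics

# The voltmeter readings from the example above, in volts
readings = [100, 101, 102, 103, 105]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)  # sample standard deviation

# A spread that is small relative to the mean indicates precise readings.
print(f"mean = {mean} V, spread = {spread:.2f} V")
```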
Classifications of Precision
Precision is sometimes separated into two parts: repeatability and reproducibility.
- Repeatability is the variation that arises when conditions are kept identical and repeated measurements are taken over a short time period.
- Repeatability is defined as the closeness of agreement between independent test results, obtained with the same method, on the same test material, in the same laboratory, by the same operator, and using the same equipment within short intervals of time.
- Repeatability or test-retest reliability is the closeness of the agreement between the results of successive measurements of the same measure when carried out under the same conditions of measurement.
- In other words, the measurements are taken by a single person or instrument on the same item, under the same conditions, and in a short period of time. A less than perfect test-retest reliability causes test-retest variability.
- Such variability can be caused by, for example, intra-individual variability and inter-observer variability. A measurement may be said to be repeatable when this variation is smaller than a pre-determined acceptance criterion.
- The statistical treatment of repeatability was popularized by Bland and Altman.
- For repeatability to be established, the following conditions must be in place: the same location; the same measurement procedure, the same observer, the same measuring instrument, used under the same conditions; and repetition over a short period of time.
- What’s known as “the repeatability coefficient” is a measure of precision: the value below which the absolute difference between a pair of repeated test results is expected to lie.
- Repeatability is the ability to obtain the same result when measuring the same quantity several times in succession. The turbine meter’s VMFt has a required repeatability range of 0.05%. The orifice meter’s MMFo has a required repeatability range of 0.10%.
- Repeatability has been expressed as a percentage on each orifice proving report.
- Reproducibility is the variation that arises when the same measurement process is used with different instruments and operators, and over longer time periods.
- Reproducibility, on the other hand, refers to the degree of agreement between the results of experiments conducted by different individuals, at different locations, with different instruments. Put simply, it measures our ability to replicate the findings of others. Reproducibility is determined through controlled interlaboratory test programs.
- Reproducibility is the ability to obtain the same result when measuring the same quantity under different conditions (i.e. time, observer, operating conditions, etc.).
- With respect to the turbine meter, reproducibility is not significant, since the meter is proved immediately prior to proving an orifice meter. The orifice meter’s MMFo reproducibility was demonstrated over a time period of several days at different flow rates and was generally within 0.10%.
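The repeatability coefficient mentioned above can be sketched from paired test-retest data. This assumes the common Bland-Altman form, RC = 1.96 · √2 · s_w, where s_w is the within-subject standard deviation; the readings below are hypothetical:

```python
import math

def repeatability_coefficient(test, retest):
    """Bland-Altman repeatability coefficient: the absolute difference
    within which about 95% of repeated test-retest pairs are expected
    to agree. RC = 1.96 * sqrt(2) * s_w."""
    diffs = [a - b for a, b in zip(test, retest)]
    # For paired repeats, the within-subject variance is mean(d^2) / 2.
    s_w = math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
    return 1.96 * math.sqrt(2) * s_w

# Hypothetical test-retest readings of the same items
rc = repeatability_coefficient([10.0, 12.0, 11.0, 13.0],
                               [10.2, 11.8, 11.1, 12.9])
print(round(rc, 2))  # 0.31
```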
Overall Examples of Accuracy and Precision
Here are some examples of accuracy and precision
A good analogy for understanding accuracy and precision is a football player shooting at the goal. If the player shoots into the goal, he is said to be accurate. A player who keeps striking the same goalpost is precise but not accurate.
A player can therefore be accurate without being precise if he hits the ball all over the place but still scores, while a precise player will hit the same spot repeatedly, irrespective of whether he scores.
A player who is both precise and accurate will aim at a single spot and also score the goal.
Accurate and Precise: If the weather thermometer reads 28 °C outside and it really is 28 °C, the measurement is accurate. If the thermometer consistently registers the same temperature for several days, the measurement is also precise.
Precise, but not Accurate: If you measure the mass of a 20 kg body and get readings of 17.4, 17.0, 17.3, and 17.1 kg, your weighing scale is precise but not very accurate.
Accurate, but not Precise: If your scale gives you values of 19.8, 20.5, 21.0, and 19.6 kg, it is more accurate than the first scale but not very precise.
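The two weighing-scale examples can be checked with a small sketch; the ±0.5 kg tolerance and the rule itself are assumed thresholds for illustration, not part of the original examples:

```python
import statistics

def assess(readings, true_value, tolerance):
    """Label readings as (accurate, precise) under a simple assumed rule:
    accurate if the mean is within `tolerance` of the true value,
    precise if the sample standard deviation is within `tolerance`."""
    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)
    return abs(mean - true_value) <= tolerance, spread <= tolerance

print(assess([17.4, 17.0, 17.3, 17.1], 20, 0.5))  # (False, True): precise, not accurate
print(assess([19.8, 20.5, 21.0, 19.6], 20, 0.5))  # (True, False): accurate, not precise
```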
Not Accurate: If in the lab you obtain a weight measurement of 3.2 kg for a substance whose actual or known weight is 10 kg, your measurement is not accurate, because it is not close to the known value.
Precise, but not Accurate: A refrigerator thermometer is read ten times and registers, in degrees Celsius: 39.1, 39.4, 39.1, 39.2, 39.1, 39.2, 39.1, 39.1, 39.4, and 39.1. However, the real temperature inside the refrigerator is 37 °C. The thermometer is not accurate (it is almost two degrees off the true value), but because the readings are all close to 39.2, it is precise.