Precision and accuracy are often thought of as being basically the same thing. While this is true in a general sense, they are two different things when it comes to data and calculation.

Accuracy is the proximity of measurement results to the true value, whereas precision is the repeatability or reproducibility of the measurement.

To put this into simpler terms, let’s use a diagram.

Here, we see four diagrams of a shooting target, each labeled with the accuracy and precision of the shots taken.

Now, if you look at the top left, it's obvious that none of the shots are accurate or precise; the shots are all over the place and not consistent. In the top right we see our first example of *precision without accuracy*. Here, the shots (or measurements) are consistent, but are not near the *true value*, which in this case is the bullseye (the center of the target).

Looking at the bottom left, we now see *accuracy without precision*. Notice how the shots are close to the center, but are scattered around it rather than landing in one consistent spot. And on the bottom right we see *precision and accuracy*: all the shots are at or near the center (the true value) and are consistent, keeping a tight grouping of measurements.

Precision and accuracy become very important factors, and lead us into the subject of significant digits. Significant digits are the digits of a number counted starting with the first non-zero digit. For example, 0.0124 has three significant digits. This matters because, to be precise in our findings, we must retain the same number of significant digits throughout our data. Significant digits can also play a key role in accuracy. Say our true value is 0.07564, and our measurement is recorded as 0.07554. If we round this measurement to 0.076, that isn't necessarily incorrect. However, by keeping the same number of significant digits as our true value (four), our data is rounded less, and thus stays closer to the true value.
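The rounding comparison above can be sketched in a few lines of Python. Note that `round_sig` is an illustrative helper written for this example, not a standard library function; it rounds a value to a chosen number of significant digits so we can compare the error against the true value.

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to `sig` significant digits (illustrative helper)."""
    if x == 0:
        return 0.0
    # floor(log10(|x|)) locates the first non-zero digit,
    # which tells round() which decimal place to keep.
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

true_value = 0.07564   # the true value from the example above
measured = 0.07554     # our recorded measurement

for sig in (2, 4):
    rounded = round_sig(measured, sig)
    error = abs(true_value - rounded)
    print(f"{sig} significant digits: {rounded} (error {error:.5f})")
```

Keeping four significant digits gives 0.07554, with an error of about 0.0001 from the true value, while rounding to 0.076 roughly triples that error.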
