Precision – Concept, examples, measuring instruments, and precision vs. accuracy

We explain what precision is and why it is important in measurements, with examples and the differences from accuracy.

That which is precise consistently obtains correct results.

What is precision?

In general, when we talk about the precision of something or someone, we refer to their ability to hit the target consistently, that is, to obtain the expected results or results very close to those expected. Although in everyday speech it can be a synonym of accuracy, it is not advisable to confuse the two terms.

The word precision comes from the Latin praecisionis, derived from the verb praecidere, which can be translated as “to cut well”, “to cut at both ends”, or “to separate completely by cutting away what is left over.” This verb was formed from the roots prae- (“forward” or “in advance”) and caedere (“to cut” or, sometimes, “to kill”).

Originally this word was used to refer to what had been cut or severed from the body (eunuchs, for example, were called praecisus, “cut off”); its current meaning comes from the word’s figurative application in rhetoric, that is, in relation to oratory.

There, praecisus referred to what was “well cut”, that is, well defined, well focused, and therefore adhering to the subject at hand in the best way. In other words, that which is pertinent, which sticks to what is necessary.

Thus, today we refer to precision as the ability to hit the target, or close to it, across different attempts. For example, a darts player has three chances to throw at the bullseye, and once he has done so, he can judge how close to the center his throws landed and thus how precise he was.

This kind of value can be of great importance in scientific disciplines, engineering, and statistics.

Precision in measuring instruments

Measuring instruments are the tools and devices that allow us to express a given natural magnitude in numerical values. These measurements can be more or less precise, that is, they contain a certain margin of error attributable to contextual and unpredictable factors. Thus, a set of measurements can vary from one to another, despite the fact that the same magnitude is being measured.

Let’s imagine, by way of example, that we take our body temperature with a thermometer, and that we do it several times to be sure there is no inadvertent error. If we notice that the measurements all cluster around the same value (ideally the real or estimated temperature), we will know that it is a precise thermometer, that is, one that registers its values consistently.

That is, an instrument that consistently tends to measure correctly is precise. On the other hand, if the temperature varies widely between one measurement and the next, we must conclude that the thermometer has lost its necessary precision, since some measurements will be closer to the real value and others will be far from it. And how do you know which is which?
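The thermometer scenario above can be sketched numerically. A common way to quantify the spread between repeated measurements is the standard deviation; all readings below are hypothetical:

```python
import statistics

# Hypothetical repeated body-temperature readings (°C) from two thermometers.
precise_readings = [36.9, 37.0, 37.0, 36.9, 37.1]
imprecise_readings = [36.2, 38.1, 36.7, 37.8, 35.9]

# Precision: how tightly the repeated measurements cluster together.
# A smaller standard deviation indicates a more precise instrument.
spread_a = statistics.stdev(precise_readings)
spread_b = statistics.stdev(imprecise_readings)

print(f"precise thermometer spread:   {spread_a:.2f} °C")
print(f"imprecise thermometer spread: {spread_b:.2f} °C")
```

The first thermometer's readings stay within a couple of tenths of a degree of each other, while the second's scatter by more than two degrees, which is exactly the loss of precision the paragraph describes.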

Examples of precision

As an example, we can visualize some cases in which precision is a determining factor:

  • Every batter in a professional baseball league has a batting average, a measure of his performance at bat. This average is a numerical approximation of his precision at bat, that is, of how many hits he gets out of all the at-bats that correspond to him in a game.
  • A soldier trains for war and fires the 100-round magazine of his rifle at a target. He then goes and checks the number of impacts on the target dummy, and can get an estimated idea of his precision, that is, of how many shots hit the target or came close to doing so, and how many missed.
  • During a medieval siege, the operators of a catapult attempt to hurl stones against the enemy walls. But the catapult is not well calibrated, and each rock they launch follows a different trajectory: some hit the walls, others the nearby river, others the battlefield, where they crush allied troops. Logically, it is a very imprecise catapult, since its shots do not tend to land where it was aimed.
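The batting-average and rifle examples above are both hit-rate estimates, a simple numeric proxy for precision; a minimal sketch, with all figures invented:

```python
def hit_rate(hits: int, attempts: int) -> float:
    """Fraction of attempts that hit the target."""
    if attempts == 0:
        raise ValueError("at least one attempt is required")
    return hits / attempts

# A batter with 30 hits in 100 at-bats has a .300 average:
print(hit_rate(30, 100))  # 0.3

# A soldier with 87 impacts out of a 100-round magazine:
print(hit_rate(87, 100))  # 0.87
```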

Precision and accuracy

In science, engineering, and statistics, it is important to distinguish the notion of precision from that of accuracy, even though in everyday speech they are often used as synonyms. This difference is particularly important when interpreting the results obtained during a measurement, and it comes down to the following:

  • Precision: as we have seen, it is determined by the ability of an instrument or a measurement technique to record similar values in a series of successive measurements, which can vary from one another depending on the margin of error. The closer together the measurements, the greater the precision of the device.
  • Accuracy: instead, it has to do with the closeness of the measurements to the expected or real value; in other words, how close a measurement is to reality. The closer it is to the expected or actual value, the more accurate the instrument.
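The two definitions above can be sketched in code: one hypothetical instrument that is precise but not accurate, and another that is accurate but not precise (all values invented for illustration):

```python
import statistics

TRUE_VALUE = 100.0  # hypothetical real value of the measured magnitude

# Instrument A: readings cluster tightly, but far from the true value
# (precise, not accurate). Instrument B: readings center on the true
# value, but scatter widely (accurate, not precise).
readings_a = [90.1, 90.0, 89.9, 90.2, 90.0]
readings_b = [97.0, 103.5, 99.0, 101.0, 99.5]

def precision(readings):
    # Spread between successive measurements: smaller = more precise.
    return statistics.stdev(readings)

def accuracy_error(readings):
    # Distance of the average reading from the real value: smaller = more accurate.
    return abs(statistics.mean(readings) - TRUE_VALUE)

print(precision(readings_a), accuracy_error(readings_a))
print(precision(readings_b), accuracy_error(readings_b))
```

Instrument A has a small spread but a large error from the true value; instrument B is the reverse, which is why the two terms must not be confused.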

This difference can be easily understood with an example: suppose a golfer tries to make a hole in one to break a record. Although he is a good golfer, there are variables that influence his shots: the wind, the humidity, the condition of the golf ball, or the force he puts into the swing; so he will have to try many times before he finally achieves it.

If we judge how close to the hole his balls have landed, we will find a measure of his accuracy, since we know that the reference value is the hole itself. On the other hand, if we look at how consistently his shots land near one another across the total number of attempts, we can find his precision, that is, the margin of error his shots have in general.