
What is the difference between accuracy, precision, resolution and sensitivity?

Manufacturers of digital instruments usually publish specifications that define their accuracy, precision, resolution and sensitivity. Unfortunately, these specifications are not uniform from one manufacturer to another, nor are they always interpreted in the same way. Moreover, even when they are given, it is not always clear how they apply to a system and to the variables being measured.

Accuracy
Accuracy describes how far a measured value can deviate from the true value of the measured signal. It can be defined as the amount of uncertainty in a measurement with respect to the true value. Accuracy specifications usually refer to the error introduced by the instrument and are given as a percentage. For example, if the accuracy is specified as 3% and 100V is measured, the instrument can show any value between 97V and 103V on the display.
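
To make the arithmetic concrete, here is a minimal Python sketch of how a percentage accuracy specification translates into display limits (the helper accuracy_bounds is hypothetical, chosen for illustration):

    def accuracy_bounds(reading, accuracy_pct):
        """Return the (low, high) interval implied by a percentage accuracy spec."""
        error = reading * accuracy_pct / 100.0
        return reading - error, reading + error

    # 3% accuracy on a 100V reading -> (97.0, 103.0)
    print(accuracy_bounds(100.0, 3.0))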

The specified accuracy is normally given for typical environmental conditions such as room temperature and normal humidity. It is also often defined only for a particular part of the range. For example, if the display range is 0V to 1000V, the accuracy may be given only for the range 5V to 1000V.

Precision
Precision describes the reproducibility of a measurement; another term used for precision is repeatability. For example, when a steady-state signal is measured many times and the resulting values are close together, the measurement has a high degree of precision or repeatability.

An accurate measurement is not necessarily precise, and, on the other hand, precision does not imply accuracy. For example, when a steady voltage of 100V is measured and successive readings on the display are 93.0V, 93.1V, 92.8V..., the unit is precise but not accurate. On the other hand, if the displayed values vary like 97.1V, 102.9V, 100.0V..., the unit is accurate on average but not precise.
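
The two effects can be quantified: the mean offset of repeated readings reflects accuracy, while their spread (standard deviation) reflects precision. A short Python sketch using the readings above (the helper summarize is illustrative only):

    import statistics

    def summarize(readings, true_value):
        """Mean offset gauges accuracy; standard deviation gauges precision."""
        offset = statistics.mean(readings) - true_value
        spread = statistics.stdev(readings)
        return offset, spread

    # Precise but not accurate: small spread, large offset from 100V
    print(summarize([93.0, 93.1, 92.8], 100.0))    # about (-7.03, 0.15)

    # Accurate on average but not precise: zero offset, large spread
    print(summarize([97.1, 102.9, 100.0], 100.0))  # about (0.0, 2.9)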

Resolution
Resolution is the smallest increment that an instrument can show on the display; in other words, the number of significant digits (decimal places) to which a measured value is displayed. If the resolution of an instrument is 0.1V, a voltage of 10V will be shown as 10.0V.

Resolution can also be expressed as the ratio between the maximum value of a signal and the smallest part that can be displayed. It is sometimes interpreted as the degree to which a change can theoretically be detected, and it can be related to the number of bits of the A/D converter used for the measurement.
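
Under the simplifying assumption of an ideal converter, the relationship between A/D bits and resolution can be sketched in a few lines of Python:

    def adc_resolution(full_scale, bits):
        """Smallest step an ideal N-bit A/D converter can distinguish.

        Idealized model: full scale divided by 2**bits, ignoring noise
        and converter non-linearity.
        """
        return full_scale / (2 ** bits)

    # A 12-bit converter on a 10V range resolves steps of about 2.44mV
    print(adc_resolution(10.0, 12))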

A technique called averaging can improve the resolution, but it sacrifices speed: multiple readings are added together and then divided by the number of samples. However, averaging cannot reduce systematic effects such as non-linearity or offset errors.
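
A minimal sketch of the idea in Python (illustrative, not any instrument's firmware):

    def averaged_reading(samples):
        """Combine several raw samples into one reading by averaging.

        Trades speed for resolution: random noise shrinks roughly with the
        square root of the sample count, but systematic errors remain.
        """
        return sum(samples) / len(samples)

    # Ten noisy samples of a steady 10V signal yield one smoother reading
    print(averaged_reading([9.98, 10.02, 10.01, 9.99, 10.00,
                            10.03, 9.97, 10.00, 10.01, 9.99]))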

Counts and digits are terms that are sometimes used to describe an instrument's resolution.

The accuracy of an instrument can also include a number of digits. For example, the accuracy can be specified as (3% + 3 digits). If 100V is measured while the resolution is 0.1V, the instrument is within its accuracy specification if it shows values between 96.7V and 103.3V on the display.
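
The earlier accuracy sketch can be extended to a "(% + digits)" specification; the interpretation assumed here is percentage of reading plus a number of counts of the least significant display digit:

    def spec_bounds(reading, accuracy_pct, digits, resolution):
        """Bounds for a '(x% + n digits)' accuracy specification."""
        error = reading * accuracy_pct / 100.0 + digits * resolution
        return reading - error, reading + error

    # (3% + 3 digits) at 100V with 0.1V resolution -> (96.7, 103.3)
    print(spec_bounds(100.0, 3.0, 3, 0.1))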

Sensitivity
Sensitivity is an absolute quantity: the smallest absolute amount of change that an instrument can detect. It is defined as the ratio of the change on the display of an instrument to the change in the value of the quantity being measured, and it denotes the smallest change in the measured variable to which the instrument responds. A higher sensitivity indicates that the system can respond to even smaller inputs. However, the sensitivity of an instrument can also depend on environmental conditions such as noise.
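
Expressed as a formula, sensitivity is the ratio of output change to input change; a minimal Python illustration (names chosen for this example only):

    def sensitivity(display_change, input_change):
        """Sensitivity as the ratio of display change to input change."""
        return display_change / input_change

    # One 0.1V display step per 0.2V input change -> sensitivity 0.5
    print(sensitivity(0.1, 0.2))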

Sensitivity should be distinguished from resolution; it is normally equal to or coarser than the resolution. For example, the resolution of the display may be 0.1V while the smallest change of voltage the unit can detect is 0.2V.