6 Things You Should Know About Calibration in 2023
Updated: Feb 25
A systematic process: description, documentation, and historical development
Calibration is a process that assures the accuracy of an instrument through a systematic comparison between two items, one of which is a recognized standard measure. It provides a means of compensating for errors in measuring instruments and/or for variations in environmental conditions. The word calibration derives from "caliber" (French calibre, originally the bore of a gun), a term of uncertain origin.
Modern calibration processes
The early calibration of pressure instruments was performed by the "gauge method." This involved applying a known pressure to an instrument and then measuring its response with a reference gauge. The accuracy of this method depended on two things: how well you could control the source of pressure, and whether your instrument responded linearly to changes in source pressure.
The modern calibration process is much more sophisticated than this. The first thing to understand is that there are two types of error in any measurement system: random error (also called statistical noise), which has no predictable pattern, and systematic error, which does. If you know what these patterns look like, you can take steps to minimize them during calibration.
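The distinction can be sketched in a few lines of Python. The readings, the 100.0-unit reference value, and the size of the bias are all illustrative assumptions, not values from any real instrument:

```python
import statistics

# Hypothetical readings of a 100.0-unit reference standard.
# Each reading = true value + systematic bias + random noise.
readings = [101.8, 102.3, 101.9, 102.1, 101.7, 102.2]

mean_reading = statistics.mean(readings)
systematic_error = mean_reading - 100.0     # predictable bias -> correctable
random_spread = statistics.stdev(readings)  # noise -> averaged down, never removed

# Calibration compensates for the systematic part:
corrected = [r - systematic_error for r in readings]
print(f"bias ~ {systematic_error:.2f}, noise (1 sigma) ~ {random_spread:.2f}")
print(f"corrected mean = {statistics.mean(corrected):.2f}")
```

Note the asymmetry: averaging repeated readings shrinks the random part, while only a calibration correction removes the systematic part.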
The quality of the calibration depends on the quality of the standard. The standard must be traceable to a national or international standard. It must also be kept in a controlled environment to prevent damage and environmental change, such as humidity, pressure, or temperature.
The standard may come from different suppliers and may have multiple sources of error. For example, if you are calibrating a pH meter, you should use standard buffer solutions of known pH rather than indicator chemistry (phenolphthalein is an indicator, not a calibration standard), and you will need to follow the specific calibration instructions for that particular instrument based on how it was made.
Frequency, Standards required, and accuracy
The frequency of calibration depends on the application and accuracy required. A high-accuracy instrument may be calibrated at shorter intervals than a low-accuracy one, depending on its usage.
The accuracy of calibration also depends on the resolution of the instrument and its application. For example, a goniometer used to measure angles in milliradians (thousandths of a radian) will probably need to be calibrated more frequently than an inclinometer that measures angles in degrees or gradians (where a right angle is 100 gon). This is because fine-resolution instruments are more likely to show the effects of wear over time, losing accuracy due to friction between components or changes within their housing.
Manual and automatic calibrations
There are two types of calibrations: manual and automatic.
The most common calibration process is manual, which is done by comparing the instrument to a known standard. The technician may use either an internal or external standard that is traceable to NIST (the National Institute of Standards and Technology).
Automatic calibration uses a computer to compare the instrument's output against set points. It can be more accurate than the manual process because it typically checks more points, for example three instead of just two, which exposes nonlinearity that a two-point check would miss.
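As a rough sketch of why extra points matter, here is a hypothetical two-point linear correction checked against a third, mid-scale standard. All of the readings and reference values are invented for illustration:

```python
# Hypothetical raw readings taken at three known standard values.
standards = [0.0, 50.0, 100.0]   # known reference inputs
raw =       [1.2, 52.9, 104.2]   # instrument's uncorrected readings

# Two-point calibration: fit gain and offset from the endpoints only.
gain = (standards[2] - standards[0]) / (raw[2] - raw[0])
offset = standards[0] - gain * raw[0]

def correct(x):
    """Apply the linear two-point correction."""
    return gain * x + offset

# A third, mid-range point reveals any nonlinearity that the
# two-point fit cannot capture.
midpoint_residual = correct(raw[1]) - standards[1]
print(f"gain = {gain:.4f}, offset = {offset:.4f}")
print(f"mid-scale residual = {midpoint_residual:+.3f}")
```

The two endpoints come out exact by construction; the leftover error at mid-scale is exactly what a third calibration point lets you detect and correct.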
Calibration of weights, distances, and pressure instruments
In the early days of calibration, weights and distances were calibrated by comparing them against known values. Weights were weighed on a scale that had itself been calibrated using a set of reference weights; distances were measured using a rod that had been compared to another rod of known length.
The method used to calibrate pressure instruments has not changed much over the years. A standard (known) pressure difference between two points is applied, and then the instrument is checked against this standard for accuracy.
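In outline, that kind of check amounts to applying known pressures and comparing readings against a tolerance. The pressures, readings, and the ±0.25 kPa tolerance below are placeholder assumptions, not values from any real procedure:

```python
# Sketch of a calibration check for a pressure instrument:
# apply known standard pressures, compare the unit-under-test
# readings against an accuracy tolerance.
TOLERANCE_KPA = 0.25  # illustrative acceptance limit

test_points = [  # (standard pressure, instrument reading), in kPa
    (0.0, 0.05),
    (50.0, 50.21),
    (100.0, 100.31),
]

results = []
for standard, reading in test_points:
    error = reading - standard
    status = "PASS" if abs(error) <= TOLERANCE_KPA else "FAIL"
    results.append(status)
    print(f"{standard:7.2f} kPa: error {error:+.2f} kPa -> {status}")
```

An instrument that fails at any point would be adjusted (or assigned a correction) and then rechecked, which is the adjustment step that distinguishes calibration from a simple accuracy test.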
It’s important to remember that calibration is not the same as testing. Testing determines the accuracy of an instrument, either by comparing it against another item or by using a standard measurement method; calibration goes further, ensuring that your instrument performs as expected when used in its intended application.