Why do we need to calibrate sensors?

There are a lot of good sensors these days, and many are 'good enough' out of the box for non-critical applications.  But to achieve the best possible accuracy, a sensor should be calibrated in the system where it will be used.  This is because:

  • No sensor is perfect.
    • Sample to sample manufacturing variations mean that even two sensors from the same manufacturer production run may yield slightly different readings.
    • Differences in sensor design mean two different sensors may respond differently in similar conditions.  This is especially true of 'indirect' sensors that derive a measurement from one or more actual measurements of a different, but related, parameter.
    • Sensors subjected to heat, cold, shock, humidity, etc. during storage, shipment, and/or assembly may show a change in response.
    • Some sensor technologies 'age', and their response will naturally change over time, requiring periodic re-calibration.
  • The sensor is only one component in the measurement system.  For example:
    • With analog sensors, your ADC is part of the measurement system and subject to variability as well.
    • Temperature measurements are subject to thermal gradients between the sensor and the measurement point.
    • Light and color sensors can be affected by spectral distribution, ambient light, specular reflections and other optical phenomena.
    • Inertial sensors almost always have some 'zero offset' error and are sensitive to alignment with the system being measured.

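The 'zero offset' error mentioned above is usually corrected in software: average a batch of readings taken while the device is known to be stationary, then subtract that offset from later measurements.  A minimal sketch of the idea (the gyroscope readings below are invented for illustration):

```python
def estimate_zero_offset(samples):
    """Estimate a sensor's zero offset by averaging readings taken
    while the device is held stationary (the true value should be 0)."""
    return sum(samples) / len(samples)

# Hypothetical gyroscope readings at rest, in degrees/second.
at_rest = [0.41, 0.39, 0.44, 0.38, 0.43]
offset = estimate_zero_offset(at_rest)  # about 0.41

def corrected(raw):
    """Apply the offset correction to a raw reading."""
    return raw - offset

print(round(corrected(5.41), 2))  # 5.0
```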
What makes a good sensor?

The two most important characteristics of a sensor are:

  • Precision - The ideal sensor will always produce the same output for the same input.  
  • Resolution - A good sensor will be able to reliably detect small changes in the measured parameter.

What affects precision?

  • Noise - All measurement systems are subject to random noise to some degree.  Measurement systems with a low Signal to Noise Ratio will have problems making repeatable measurements.  In the diagrams above, the sensor on the right shows much better precision than the noisy one on the left.
  • Hysteresis - Some types of sensors also exhibit hysteresis.  The sensor will tend to read low with an increasing signal and high with a decreasing signal as shown in the graph below.  Hysteresis is a common problem with many pressure sensors.

To paraphrase George Santayana:  

"Those who ingnore hysteresis are doomed to unrepeatable results."

Are there any other important qualities in a sensor?

Precision and resolution are the real 'must have' qualities.  But there are a couple of other 'nice-to-have' qualities:

Linearity - A sensor whose output is directly proportional to the input is said to be linear.  This eliminates the need to do any complex curve-fitting and simplifies the calibration process.
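For a linear sensor, calibration can be as simple as taking readings at two known reference points and fitting a straight line between them.  A minimal sketch (the raw counts and reference temperatures below are invented):

```python
def two_point_calibration(raw_lo, raw_hi, ref_lo, ref_hi):
    """Build a function that maps raw sensor output to calibrated
    units, assuming the response is linear between the two points."""
    scale = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    def calibrate(raw):
        return ref_lo + (raw - raw_lo) * scale
    return calibrate

# Hypothetical thermometer: raw count 102 at 0 deg C, 998 at 100 deg C.
to_celsius = two_point_calibration(102, 998, 0.0, 100.0)
print(round(to_celsius(550), 6))  # 50.0
```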

Speed - All else being equal, a sensor that can produce precise readings faster is a good thing to have.

What about accuracy? Isn't accuracy the most important thing?

Accuracy is a combination of precision, resolution and calibration.  If you have a sensor that gives you repeatable measurements with good resolution, you can calibrate it for accuracy.

What about digital sensors? Aren't they calibrated at the factory?

To some degree, yes.  Nevertheless, digital sensors are still subject to manufacturing and operating condition variability.  For critical measurements, you need to calibrate the total system.

But the manufacturer's spec sheet says it is accurate to 0.00000001%

And it probably is - when measured in their QA test fixture using their test procedures and according to their definition of 'accuracy'.  

Your mileage may vary!

This guide was first published on May 18, 2015. It was last updated on Mar 08, 2024.
