Why are the CO2 Levels Different Between 2 Devices?

K30 CO2 Sensor

A customer recently wrote us and asked, "I am testing a SenseAir K-30 CO2 Sensor next to a desktop CO2 meter. The K-30 is reading 778ppm, while the desktop meter shows 846ppm. Which one is correct?" The simple answer is that they are both correct! Yes, they are different, but both sensors are working within their rated accuracy. What does that mean?

How is CO2 sensor accuracy defined?

The accuracy of a CO2 sensor is defined as how close its measurement is to a reference gas, expressed either as a ± (plus-minus) value in parts-per-million (ppm), as a percentage (%) of the measured value, or as a combination of both. For example, if I have a tank of CO2 known to be exactly 10,000ppm and I test dozens of sensors hundreds of times, finding that all the readings fall between 9,900ppm and 10,100ppm, I can report that the sensor's accuracy is ±100ppm, or 1% of the reading (100/10,000). This is how you'll typically see accuracy listed for a CO2 sensor.
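The calculation above can be sketched in a few lines of Python. The reference value and readings here are hypothetical, chosen to match the worked example:

```python
# Deriving a +/- ppm / percent accuracy figure from test data against a
# known reference gas. All values below are hypothetical illustrations.
reference_ppm = 10_000
readings_ppm = [9_920, 10_080, 9_950, 10_030, 9_900, 10_100]  # test runs

# Accuracy is the largest deviation observed from the reference value.
max_deviation = max(abs(r - reference_ppm) for r in readings_ppm)
percent = 100 * max_deviation / reference_ppm

print(f"accuracy: +/- {max_deviation} ppm ({percent:.0f}% of reading)")
# -> accuracy: +/- 100 ppm (1% of reading)
```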

In other words, accuracy is determined by repeatedly testing a sensor against a known reference gas and recording the range of values it reports. The wider the range, the lower the accuracy. The manufacturer first performs this testing at the factory when the sensor is designed. During production, sensors are calibrated against a reference gas, then run for a period of time. If they perform within the predetermined accuracy parameters, they are considered "accurate" and shipped.
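The production pass/fail check described above could be sketched as follows. The tolerance values are hypothetical, not taken from any datasheet:

```python
# Hedged sketch of a production pass/fail check: a sensor passes if every
# reading stays within the larger of the absolute and percentage tolerances.
def within_spec(readings_ppm, reference_ppm, abs_tol_ppm, pct_tol):
    tol = max(abs_tol_ppm, reference_ppm * pct_tol / 100)
    return all(abs(r - reference_ppm) <= tol for r in readings_ppm)

# Hypothetical readings against a 400ppm reference, +/-30ppm or +/-3% spec:
print(within_spec([388, 412, 405], reference_ppm=400, abs_tol_ppm=30, pct_tol=3))  # True
print(within_spec([350, 412], reference_ppm=400, abs_tol_ppm=30, pct_tol=3))       # False
```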

What can change CO2 sensor accuracy?

Keep in mind that a sensor's accuracy changes at different CO2 levels. A 10,000ppm CO2 sensor might have its accuracy specified from 0-2,000ppm, with a different accuracy specified (or none listed) between 2,001ppm and 10,000ppm. This is most common for sensors designed for indoor air quality, where most CO2 measurements fall between 400ppm (fresh air) and 1,200ppm (stale air).

In addition, the smaller a sensor's measurement range, the greater its accuracy tends to be. A 10,000ppm (1% CO2) sensor will typically be more accurate than a 100,000ppm (10% CO2) sensor, which will in turn be more accurate than a 100% CO2 sensor. This means that for the greatest accuracy, you should use a CO2 sensor with the smallest range that still fits the intended purpose.

Accuracy also changes if there are changes in temperature, pressure, or humidity. Therefore, unless otherwise stated, you can assume the accuracy was measured at the factory at SATP (standard ambient temperature and pressure).

So which one is correct?

According to its specifications, the SenseAir K-30 sensor has an accuracy of ±30ppm or ±3% of the reading, whichever is greater. At a reading of 778ppm, the actual amount of CO2 could be anywhere between 748 and 808ppm (30ppm > 3% of 778ppm). The TIM10 CO2 meter has an accuracy of ±50ppm or ±5% of the reading, whichever is greater, so at a reading of 846ppm, the actual amount of CO2 could be between 796 and 896ppm (50ppm > 5% of 846ppm). Since the two sensors' accuracy ranges overlap between 796ppm and 808ppm, you can say they are both correct within their rated accuracy.
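Recomputing the worked example makes the overlap easy to see. The "whichever is greater" rule simply takes the larger of the absolute and percentage tolerances:

```python
# Compute each device's accuracy range and the overlap between them,
# using the datasheet figures quoted above.
def accuracy_range(reading_ppm, abs_tol_ppm, pct_tol):
    # "Whichever is greater": larger of the absolute and percentage tolerances.
    tol = max(abs_tol_ppm, reading_ppm * pct_tol / 100)
    return reading_ppm - tol, reading_ppm + tol

k30 = accuracy_range(778, abs_tol_ppm=30, pct_tol=3)    # -> (748, 808)
tim10 = accuracy_range(846, abs_tol_ppm=50, pct_tol=5)  # -> (796, 896)

# The overlap of the two ranges: both devices agree the true level
# could lie anywhere in this interval.
overlap = (max(k30[0], tim10[0]), min(k30[1], tim10[1]))
print(overlap)  # -> (796, 808)
```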

While this may sound more like theory than practice, it holds up in the real world. If you were to measure CO2 levels with ten of the most accurate CO2 sensors available, they would all report slightly different results from one reading to the next. That's why, at the parts-per-million level, no CO2 sensor can ever be perfectly accurate.

However, from the accuracy ratings on the datasheets, we can see that the K-30 sensor is more accurate than the TIM10. Based on this information alone, I would say that while both devices are accurate, the K-30's CO2 readings will, over time, be closer to the actual CO2 level.

Want to know more? We have an app note here.
