High-compression diesel automobile engines are popular worldwide because of their high fuel economy. The drawback of burning diesel fuel is the amount of nitrogen oxides (NOx) these engines expel. To reduce NOx, automotive manufacturers use cooled exhaust gas recirculation (EGR) to lower combustion temperatures. However, cooler temperatures result in higher levels of particulate matter (PM).
This “NOx-PM tradeoff” was studied using our K-33 CO2 sensor in a thesis titled Emission Reduction and Assisted Combustion Strategies for Compression Ignition Engines with Subsequent Testing on a Single-Cylinder Engine, submitted by J. Colter Ragone to the Graduate Faculty of the University of Kansas.
Ragone used the K-33 ICB CO2 sensor on a hand-built single-cylinder test engine to precisely compare CO2 levels between the engine intake and exhaust. His research confirmed that as the amount of EGR increases, the amount of CO2 and H2O returning to the intake increases; the resulting lower oxygen levels reduce combustion efficiency.
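Comparing intake and exhaust CO2 is the standard tracer method for quantifying how much exhaust gas is being recirculated: the EGR fraction is the rise in intake CO2 over the rise in exhaust CO2, both relative to ambient air. The sketch below illustrates that relation; the function name and the sample readings are our own illustration, not values from the thesis.

```python
def egr_fraction(co2_intake: float, co2_exhaust: float,
                 co2_ambient: float = 0.04) -> float:
    """Estimate the EGR fraction from CO2 concentrations (all in %vol).

    CO2-tracer relation: the share of recirculated exhaust in the
    intake charge equals the intake CO2 rise divided by the exhaust
    CO2 rise, each measured relative to ambient air (~0.04 %vol).
    """
    return (co2_intake - co2_ambient) / (co2_exhaust - co2_ambient)

# Hypothetical readings (%vol): 3.0% CO2 at the intake, 10.0% in the
# exhaust -> roughly a 30% EGR fraction.
rate = egr_fraction(3.0, 10.0)
print(f"EGR fraction: {rate:.1%}")
```

Because the calculation depends only on CO2 ratios, the absolute accuracy of the sensor matters less than its repeatability between the two sampling points.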
By being able to accurately monitor gas inputs and outputs, Ragone was then able to test several strategies designed to reduce the NOx-PM tradeoff, including the addition of ozone to the intake gas, changes in ignition timing, and modifications to the intake gas mixing chamber.
This test engine setup will help future engineers as they work to advance the understanding of efficient engine design.