#90: IRt/c's more accurate than conventional IR in real world

A common misunderstanding among even experienced manufacturers and users of infrared temperature measurement equipment is that the accuracy of temperature measurement is solely a specification of the infrared device. This statement is correct only in the laboratory, under controlled conditions, with blackbody (emissivity = 1.0, reflectivity = 0) targets. In the real world, designs and specifications that are applicable in the laboratory can be misleading and sometimes outright incorrect. Following is a summary of the key points of accuracy in a comparison between the IRt/c and conventional IR (the detailed mathematical development is presented in Tech Note #89).


1. Real-world materials have emissivity < 1, and therefore have reflectivity > 0, which causes errors from varying background temperatures

Even non-metals, with emissivity of approximately 0.9 and reflectivity of approximately 0.1, reflect about 10% of the energy incident from the background ambient. This reflected energy, unrelated to the target temperature, is nevertheless measured by the infrared sensor, and therefore introduces significant errors when the ambient temperature changes, as it does in the real world. Conventional IR devices, calibrated with blackbodies, usually ignore this effect and thus are subject to the error. IRt/c's are specifically designed and tested to include a correction for this effect, which improves their real-world accuracy.
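To illustrate the size of this effect, here is a minimal Python sketch using the simplified gray-body radiometer model signal = ε·T_target⁴ + (1 − ε)·T_background⁴ (temperatures in kelvin). The function name, the 0.9 emissivity, and the specific temperatures are illustrative assumptions; this is not Exergen's actual correction algorithm.

```python
# Sketch of the reflected-background effect, assuming a simple
# Stefan-Boltzmann radiometer model (illustrative, not Exergen's algorithm):
#   signal = eps * T_target^4 + (1 - eps) * T_background^4   (kelvin)
# A blackbody-calibrated reading treats the whole signal as target emission.

def apparent_temp_c(target_c, background_c, emissivity=0.9):
    """Blackbody-interpreted reading for a gray target plus reflected background."""
    t_target = target_c + 273.15
    t_bg = background_c + 273.15
    signal = emissivity * t_target**4 + (1.0 - emissivity) * t_bg**4
    return signal**0.25 - 273.15

# Target held at 100 C while the background ambient drifts from 20 C to 50 C:
for bg in (20.0, 35.0, 50.0):
    print(f"background {bg:5.1f} C -> apparent target {apparent_temp_c(100.0, bg):6.1f} C")
```

In this model, with the target held at 100°C, a 30°C swing in background ambient shifts the blackbody-interpreted reading by roughly 2°C, even though the target itself has not changed.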


2. Real-world materials have emissivities that change significantly with temperature, which causes significant errors even with perfect calibration and linearization

Probably the most misleading concept in infrared thermometry is that emissivity is constant with varying temperature. Real-world materials have emissivity variations that range from an average of 2% per 60°C (100°F) temperature change for non-metals, to 10% per 60°C (100°F) temperature change for some paints, and well over 100% per 60°C (100°F) temperature change for some metals. Accordingly, accuracy of real-world temperature measurements should be considered valid only for a limited temperature range. Accuracy specifications for conventional infrared devices quoted over wide target temperature ranges are therefore largely meaningless. (The sole exception is Exergen's D-Series, due to its Automatic Emissivity Compensation System.)
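As a rough numerical illustration of why a limited range matters, the sketch below applies the 2% per 60°C non-metal drift figure quoted above to a simplified model in which reflection is ignored and the instrument back-calculates temperature from a fixed emissivity setting. The calibration point, emissivity value, and function name are illustrative assumptions.

```python
# Sketch of the error from temperature-dependent emissivity, using a simplified
# model (reflection ignored): signal = eps_true * T_true^4, and the instrument
# back-calculates T_read = (signal / eps_set)^(1/4) with a fixed eps_set.
# The 2% / 60 C drift rate for non-metals is the figure quoted above; the
# calibration point and the 0.90 emissivity are illustrative assumptions.

def reading_with_fixed_emissivity(true_c, eps_set, cal_c=25.0,
                                  eps_at_cal=0.90, drift_per_60c=0.02):
    """Reading in C when emissivity drifts with temperature but eps_set stays fixed."""
    t_true = true_c + 273.15
    eps_true = eps_at_cal * (1.0 + drift_per_60c * (true_c - cal_c) / 60.0)
    signal = eps_true * t_true**4
    return (signal / eps_set) ** 0.25 - 273.15

# Emissivity setting fixed at the 25 C value; target actually at 145 C,
# i.e. 120 C above the calibration point (about +4% emissivity for a non-metal):
print(reading_with_fixed_emissivity(145.0, eps_set=0.90))
```

In this model a target 120°C above the point where the emissivity setting was fixed reads roughly 4°C high, even with perfect calibration and linearization at the original point.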

IRt/c's are specifically designed and tested to maintain very high accuracy over a limited temperature range, and are deliberately not specified to imply accuracy over a wide target temperature range. For this reason, IRt/c's are offered in a variety of temperature range selections, each optimized for a specific, limited temperature range; this correctly reflects real-world material characteristics and maximizes accuracy.


3. Real-world temperature control is most accurate when the IR sensor is designed, built, calibrated, and tested under factory conditions that reflect actual field conditions

Conventional IR devices are designed, built, and tested to standards traditionally defined by blackbodies, which do not include the errors caused by reflected energy and by the emissivity variations of real-world materials.
IRt/c's are designed, built, and tested to standards that include elevated and variable ambient background temperatures, as well as real-world target materials whose emissivity changes with temperature, thus maximizing the accuracy of temperature measurement and control.


