Verifying an Infrared Imager's Calibration
G. Raymond Peacock
Temperatures.com, Inc.
Southampton, PA 18966
Abstract
A brief walk-through of why and how an individual can verify the radiometric (temperature) calibration of their infrared thermal imager, and why some popular myths are untrue. There are three main obstacles to doing verification on a regular basis.
First, the equipment suppliers seldom recommend that you do it. Second, very few users have learned how to do it successfully. Finally, the uncertain but clever user can often rationalize their way out of the work by adopting popular myths like: “If the equipment vendors wanted me to do it, they would have told me how”; or “Even if it is not to specifications, I can’t do anything about it”; and the most popular, “I don’t need to know, I only look for temperature differences”.
This presentation provides a more balanced and rational view of calibration and instrument performance, along with some methods for verifying the temperature calibration of your imager and tracking it over time. Your confidence in your equipment will be more solidly based once you begin to collect real numbers instead of myths.
Introduction
In 2005-2006, I conducted an informal survey of nearly all the world’s makers of quantitative infrared imagers and imager systems. The results were presented as a technical paper at ThermoSense XXVIII in April 2006 and published in the Proceedings of that meeting [1].
Displayed in the following figure is one of the key results, and it is very informative. It deals with the support offered by seven of the 31 companies surveyed. Eight companies responded to the survey; one of the eight declined to complete it; and seven provided the data used for the analysis. As a result, the survey was not very representative of what is actually done worldwide, but it does indicate a few obvious points.
One of them is that the survey was not something many manufacturers wanted to complete. In fact, more than 70% declined to respond to questions about the traceability of the temperature calibrations they perform. It was also most interesting to note that the majority of the responding organizations are also makers of spot infrared thermometers, and some also offer calibration equipment and training for those wishing to learn how to perform calibrations of spot IR thermometers themselves.
Two conclusions are easily drawn from these results.
First, however, one must realize that there are neither international nor US standards for temperature calibration of these devices. The survey showed what this situation can produce.
The first conclusion is that the equipment makers do not agree with each other, and do not provide the same type or level of information about temperature calibration and verification of the devices they sell. The second is that they provide even less support to help the user determine, even roughly, whether the calibration of their device is within specification tolerances or not.
Part of this anomaly is, in my opinion, due to the fact that device specifications and test methods are vague and sometimes misleading. Given the lack of national or international standards, even for the terminology used to describe key measurement parameters, that is not a big surprise. At least it is now partially documented.
How, then, is the user able to judge some very simple facts about the temperature readings they obtain using their imagers?
Let’s back up to something even more fundamental. How is a user able to know if his equipment is functioning to within the expected capability when he first receives it, brand new, right from the supplier?
If we are able to provide some simple test methods to do that, then the user has a hope of determining a few other important facts about the performance of his device, like:
1. When he brings it back from use in the field, is it still responding in the same manner as when he first received it?
2. What sort of measurement tolerance should a user expect a device to provide in use? Is the indicated temperature reading within ±1 or 2°F or really more like ±10-15 or even 20°F?
Yes, I know, there are many application complications to unraveling the true temperature in field measurements. There are just as many, if not more, in evaluating the basic calibration uncertainty of a measurement device. Both are a combination of art & science.
The goal of this talk is to review some of the options that users have in verifying the calibration of their quantitative thermal imager when they first receive it and later as they use it.
Discussion
Demythtifying Calibration Verification
Before we get into how to do it, let’s look at some of the whys, or more precisely, consider the objections to doing calibration checks. In other words, let’s dispel a few popular myths.
Some common myths about thermal imager temperature calibration:
Myth 1: If the manufacturer wanted me to do it, they would tell me how and probably why. Since they don’t, I don’t need to do it.
That is patently false. Some manufacturers do recommend that you check and some don’t. Some tell you how and even offer training courses so you can do it correctly. Unfortunately, most do not. Most simply recommend that after some period of use, the devices be returned to them for a check and possible re-calibration.
In my experience, I have never ordered a measurement device of any type that could not be verified or shown to have its calibration traceable to a national or international measurement standard. Traceable calibration has been the most significant change in scientific and industrial measurement over the last 25 years; it is the starting point for assured measurement quality. To quote the late J. V. Nicholas and Rod White, authors of the exceptional book Traceable Temperatures [2]: “Traceability is not something you sort out after a measurement, you start with traceability at the outset”.
Myth 2: I don’t want to know or have to know if it’s out of calibration.
That’s exactly when you need it fixed, or your head examined!
How can you use a piece of equipment if you are uncertain whether or not it is working correctly, especially if your livelihood depends upon the measurements you take? Better yet, how could you defend its measurement results in a lawsuit if you cannot show a continuous record of traceable calibration verifications?
Q: Mr./Ms. Thermographer, can you demonstrate to the Court that the results you claim are true? That the abnormally high temperatures you reported to your Client, the Defendant in this lawsuit, were true, traceable, accurately measured, and high enough to be sufficient cause to evacuate the entire building, so that the many companies within the building suffered the significant monetary loss they are now suing him, and by implication you, to recover? Please produce your continuous records of calibration stability.
A: Huh?
The reality is that you need to know not only that your equipment works correctly when you first get it, but also that it remains within specification, or responds the same way, every time you use it. To be unsure is a recipe for disaster. One of the “rules” for a working thermographer is, or should be: be sure your tools are working properly.
Myth 3: I only measure temperature differences in a scene, and do not need accurate calibration to see large ones.
This is the biggest and falsest myth of all. Table 1 lists the approximate systematic errors that result from a 1% error, or change in sensitivity, of a radiometric temperature measurement for instruments in three different wavebands. The approximate center wavelength of each band is listed, along with the common terms used to describe the generic thermal imager wavebands: near infrared (NIR), mid-waveband infrared (MW), and long waveband infrared (LW). For each waveband, note the differential temperature sensitivity and how it changes for the same radiometric error.
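To see where numbers like those in Table 1 come from, here is a minimal sketch using Wien’s approximation; the nominal center wavelengths of 1 µm (NIR), 4 µm (MW), and 10 µm (LW) are assumptions made for illustration, and any real imager’s band will differ:

# Sketch: systematic temperature error caused by a 1% radiance (sensitivity)
# error, using Wien's approximation L ~ exp(-c2/(lambda*T)), which gives
# dT = (lambda * T^2 / c2) * (dL/L).
C2 = 14388.0  # second radiation constant, micrometer-kelvin

def temp_error(wavelength_um, temp_k, radiance_error=0.01):
    """Approximate temperature error (K) for a fractional radiance error."""
    return (wavelength_um * temp_k ** 2 / C2) * radiance_error

# Nominal center wavelengths, assumed for illustration only.
for band, wl_um in [("NIR", 1.0), ("MW", 4.0), ("LW", 10.0)]:
    print(f"{band} ({wl_um:4.1f} um): {temp_error(wl_um, 300.0):.2f} K "
          "error at 300 K from a 1% radiance error")

Under these assumptions, the same 1% radiance error maps to roughly 0.06 K in the NIR, 0.25 K in the MW, and 0.63 K in the LW band at 300 K, and the error grows with the square of the absolute temperature. A sensitivity change therefore does not affect all parts of a scene equally, and it does not simply cancel out of a temperature difference.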
Part of what sustains this myth is the variety of answers you’ll get when you ask: how big a difference do you need to measure, and to what precision? Much of the answer is tied directly to the region of temperatures you wish to measure and the effect you are documenting.
Myth 4: Even if my unit is out of calibration, I do not know how to fix or adjust it.
You can learn how. Better yet, you can learn how to find someone who already knows, who will do it for you and possibly teach you simple steps to verify on a regular basis and, especially, to adjust for any errors when performing field measurements.
Calibration is neither brain surgery nor rocket science, despite what some may want you to believe. You can learn simple steps to take to confirm the instrument is accurate when you first receive it and further steps to take to ensure it remains stable to within desired tolerances.
You don’t have to do the work yourself, either, if you would rather not or don’t have the time. Simply require your vendor to do the job for you, or find an independent third party who will, for a moderate fee, either do it or provide you the equipment and know-how so you can do it yourself.
In the worst scenario (in a pinch), if you know by testing that your instrument is out of calibration by a certain amount, you are often, but not always, able to correct for that error by adjusting a reading by the correct amount. Knowing how and when to apply such a correction takes a little more thought and can depend greatly on the difference in conditions between the calibration check and a field measurement. It is a correction to consider carefully, but it can be made most times.
What is Calibration and How is it Related to Verification?
Strictly speaking, calibration is any act that involves comparing the measured result with a traceable reference value. Most people consider calibration to be a series of such comparisons over the entire measurement range, or span, of the thermal imager. Many calibration laboratories report their results in terms of a number of such points, perhaps as few as three (low, midpoint and high) across that span. Other services calibrate at points as close together as every 50 or 100 degrees within the span.
Often, in the act of performing such a series of calibrations, a manufacturer or calibration service will adjust the effective zero and gain of an electronic instrument to minimize the errors found. That is, they perform a set of calibration corrections to bring the device within advertised tolerances, or to minimize the errors across the entire span or a portion of it, as the owner requests. In that case, they will report the calibration results in terms of both the “As Found” and “As Left” conditions. Some will merely report the deviation of the device’s response from the reference values.
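As a purely hypothetical illustration of such a report, a three-point as-found/as-left record might look like the sketch below; the numbers and field names are invented, not any laboratory’s actual format:

from dataclasses import dataclass

@dataclass
class CalPoint:
    """One comparison with a traceable reference (hypothetical format)."""
    reference_c: float  # traceable reference temperature, deg C
    as_found_c: float   # reading before any adjustment
    as_left_c: float    # reading after adjustment

# Invented three-point (low, midpoint, high) example
report = [CalPoint(0.0, 1.2, 0.3),
          CalPoint(150.0, 151.8, 150.4),
          CalPoint(300.0, 303.1, 300.6)]
for p in report:
    print(f"ref {p.reference_c:6.1f} C: "
          f"as-found {p.as_found_c - p.reference_c:+.1f} C, "
          f"as-left {p.as_left_c - p.reference_c:+.1f} C")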
In order to perform such a calibration for a thermal imager, with or without adjustment, you need a high quality blackbody simulator; an adequate, traceable reference temperature source, either an embedded temperature sensor in the blackbody or a high quality reference spot infrared thermometer; and a well-documented, practiced procedure performed by a trained and experienced calibration specialist. It is not a job for the newbie!
What constitutes an adequate reference temperature sensor? Simple: at a minimum, it must have a certified and traceable uncertainty that is four to ten times smaller than the desired uncertainty of the device under test. If the thermal imager is specified to be accurate to within ±2°C, then the reference device needs an uncertainty of no more than ±0.2 to 0.5°C in the region of comparison.
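In code form, that 4:1 to 10:1 rule of thumb is just the following arithmetic; the function name is mine, not any standard’s:

def reference_uncertainty_c(imager_tolerance_c):
    """Reference uncertainty needed for a 4:1 to 10:1 test uncertainty ratio."""
    return imager_tolerance_c / 10.0, imager_tolerance_c / 4.0

best, worst = reference_uncertainty_c(2.0)  # imager specified to +/-2 deg C
print(f"reference must be certified to +/-{best:.1f} to +/-{worst:.1f} deg C")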
How You Can Verify Calibration
Do you have to become a calibration specialist to check or verify the calibration of your imager? Of course not. You must, however, develop a method that works for you, especially when performing routine calibration stability verifications of your instrument.
First, you need a method to verify that the actual calibration is correct when you first receive your imager or whenever you have reason to suspect it has changed, say after it has been dropped or otherwise seriously stressed.
This is about the only point at which an investment in expensive calibration equipment and training would be required, and it is a lot easier and less expensive to pay a third party to perform the test for you. Most users need only test at two or three temperatures to assure themselves that the imager meets specifications. Some manufacturers also offer an optional calibration certification that can serve the same function, but you still need to ensure the imager is functioning when you get it and perform the first few calibration stability checks yourself.
Second, you need a simple check method to routinely test it, either on a regular timetable, or possibly every time before and after taking it into the field, to be sure it is within your needed tolerances. This is not a high precision test, but a highly reproducible one.
It will require at least one traceable, high resolution temperature sensor, such as an ASTM thermometer, or a certified thermocouple, resistance thermometer or thermistor with a certified readout device (possibly purchased as a small system). Suitable devices can be bought from a wide range of suppliers for $200 to $400 or less.
It will also require either a simple blackbody simulator or a home-brewed source of thermal radiation that can be easily and repeatably brought to a stable temperature. Temperature stability is essential. Such a device could be a glass or plastic container or beaker with a mixture of water and ice.
It could also be a similar container mounted on a low cost stirring laboratory hot plate. The sensor is immersed in the water or other fluid and brought to a desired temperature. The stability of the test temperature is determined mostly by the stirring action of the liquid and its thermal mass. One does not have to reproduce a test temperature exactly, merely attain a temperature within a few degrees of the check point; any calibration change in a given region will be relatively insensitive to the exact temperature of the test within a few degrees.
As long as the test target is the same material and size, and the distance to the imager is held constant for each check, one should have very reproducible test conditions.
We are in the process of publishing some openly available, free examples of methods like these on our website, TempSensor.net. Check it regularly to see what ideas are working. We are also soliciting examples from the user community at large, since we think there are many out there with clear and workable methods. Sharing works!
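In that spirit, here is a minimal sketch of what a single routine stability check reduces to: one comparison under fixed conditions. The helper name and the tolerance value are illustrative assumptions, not vendor specifications:

# One routine stability check: compare the imager's reading of the target
# against the traceable contact sensor, under the same fixed conditions
# (target, distance, lens, emissivity setting, warm-up time) every time.
def stability_check(reference_c, imager_c, tolerance_c=2.0):
    """Return the deviation and whether it is within the chosen tolerance."""
    deviation_c = imager_c - reference_c
    return deviation_c, abs(deviation_c) <= tolerance_c

# Hypothetical ice-bath check: reference reads 0.1 C, imager reads 1.4 C
dev, ok = stability_check(0.1, 1.4)
print(f"deviation {dev:+.1f} C -> {'within' if ok else 'OUTSIDE'} tolerance")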
The frequency of checks is up to you, but always start with an interval three to four times shorter than you think you need. If no changes appear after three or four tests, then you’ll feel a lot more comfortable extending the time between tests. Your optimum interval will emerge as you develop a database of results.
Third, you need to keep records. Develop a database of your check results to build a statistical basis for making decisions, such as when to send the imager back for repair or calibration adjustment, and when to shorten or lengthen the time between checks.
Hey, your gadget may be super stable and only need a return to the factory once every four or six years. Maybe it’s not so super and needs factory service every three or four months. That can be greatly affected by how you use your imager. Your testing will reveal such effects, and any trend in your calibration, over time.
You’ll learn more about the stability of your device(s) by being consistent in checking the calibration at regular intervals and under as near to identical conditions as you can arrange. The more precise you are in repeating such simple things as warm-up time, ambient conditions, (temperature, relative humidity, vibration, etc.), distance from device to target, focus setting, lens used, emissivity setting, time constant used, and so on, the more any measured differences over time will be attributable to the instrument(s) you test.
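Once a database of such checks exists, even very simple statistics will flag a drift. The sketch below uses mean ± 2 sigma control limits computed from earlier checks; the deviation values are invented for illustration:

from statistics import mean, stdev

# Logged deviations (imager minus reference, deg C), oldest first; invented data.
deviations = [0.4, 0.6, 0.3, 0.5, 0.7, 0.5, 1.6]
baseline, latest = deviations[:-1], deviations[-1]
m, s = mean(baseline), stdev(baseline)

# Flag the newest check if it falls outside mean +/- 2 sigma of the history.
if abs(latest - m) > 2 * s:
    print(f"latest deviation {latest:+.1f} C is outside "
          f"{m:.2f} +/- {2 * s:.2f} C: recheck, and consider factory service")
else:
    print("no evidence of drift; the check interval can stand or be lengthened")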
How do you do it, in practical terms? A lot depends on your temperature range of interest. A check at one, two or even three different temperature points may be needed; perhaps only one will do for the way you use your instruments. My motto is “the more the merrier”, but practicality has to rule. The minimum you need is probably about two to three times what you initially think you need.
Document your methods: write a procedure, submit it to your equipment supplier, and ask for their critique. Assure them that you are not trying to find fault with their device, but rather to develop a database showing just how stable it is.
Bottom Line
Your multi-thousand dollar temperature measuring (quantitative) thermal imager’s calibration could be really great, or not. The only way you can be sure is to check it regularly. Some think routine calibration checking, or verifying, is a task for specialists. You need to become, or appoint someone in your organization to be, that specialist, begin monitoring each of your devices, and not relent as long as you own them. Anything less is asking for trouble of many kinds.
It is the right thing to do!
References
[1] G. R. Peacock, “Temperature Uncertainty of IR Thermal Imager Calibration”, Proceedings of SPIE, Vol. 6205 (ThermoSense XXVIII), pp. 620509-1 to 620509-8, April 2006 (ISBN 0-8194-6261-6).
[2] J. V. Nicholas and D. R. White, Traceable Temperatures, 2nd Edition, John Wiley & Sons, Chichester, 2001 (ISBN 0-471-49291-4).
[3] D. P. DeWitt and G. D. Nutter (Eds.), Theory & Practice of Radiation Thermometry, John Wiley & Sons, New York, 1988 (ISBN 0-471-61018-6).