
Calibration of Radiometric Equipment

Date: April 01, 2019

Michael R. Sharlon, President

Level III Certified Infrared Thermographer
Professional Physical/Mechanical/Electrical
Electronic Senior Metrologist

Thermasearch, Inc.
43 High Point Drive
Mayflower, AR 72106
Ph: 501-514-0953
 

Abstract

Imaging and non-imaging infrared radiometers are frequently used to determine the temperature of objects. Obtaining accurate temperature values requires that the instrument be properly used and within calibration. This presentation discusses the calibration of infrared radiometers, covering calibration methods along with simple techniques that thermographers may use to determine whether their infrared equipment is performing within measurement specifications.

Introduction: History of Calibration

A few thousand years ago, the Egyptians developed a system of traceable measurement, now known as metrology. In the days of the pharaohs and kings, a granite block was carved to the length of the pharaoh’s forearm to the tip of his index finger, plus the width of his palm, and became the reference for length (the Royal Cubit Master). Architects and craftsmen building the pyramids, tombs, and temples then compared their working copies to this reference. The mandated traceability to the Royal Cubit Master produced results that remain one of today’s wonders: the pyramids were built to a total tolerance (loosely translated, error) of +/- 0.05%. That means that for every 125 feet, the Egyptian builders were off by less than an inch! One might note that if the holders of Royal Cubit copies did not bring their copy in weekly for comparison to the master, the penalty was death.

Calibration of Infrared Radiometers

What is Calibration of an Instrument and What Does it Accomplish?

Calibration of an infrared radiometer is the act of comparing the fundamental unit(s) of measure that the instrument derives with those of another instrument. This comparison instrument is capable of a more accurate reading of the same measured stimulus and has itself been compared to a still more accurate instrument. This chain of ever-tightening comparisons is tied to a national or international source. In the U.S., this source is the National Institute of Standards and Technology (NIST).

NIST uses natural phenomena in physics to establish the most repeatable unit(s) of measure. In the measurement of voltage, amperage, and resistance, the reference cell (providing an exact voltage) is one workhorse. In temperature, the most precise measurements incorporate standard 100 ohm, 0.00385-coefficient platinum resistance temperature detectors (SPRTD or SRTD) in conjunction with the freezing points of certain metals or the triple point of water (a phenomenon in nature that few elements or compounds share: the temperature/pressure point at which distilled water coexists as solid, liquid, and gas). For radiant energy measurements, an environmental enclosure is added to a contact temperature detector embedded in a nearly perfect radiating material. These sources are referred to as primary standards. A primary standard is considered as accurate a measure as you are going to obtain.

When accuracies are combined to form a total uncertainty for your instrument, the result lets you measure a stimulus with a known degree of certainty. This provides you with the opportunity to obtain a measurement capable of being repeated (when taken under similar conditions) at another time and anywhere on the planet. This further establishes a basis for an expected occurrence, i.e., ice cubes will begin to thaw at or around 32 degrees F and higher at sea level.

When discussing calibration of any instrument, including infrared radiometers, the terms accuracy, tolerance, and uncertainty are often interchanged and assumed to be similar. They are not. Each term can become very detailed in explanation and is, simply put, a subject unto itself. For the purpose of this presentation, accuracy is a statement of the possible limits of error for a given parameter of an instrument under specific conditions. The combined errors for that one instrument are its tolerance. Most measuring instruments have several errors capable of affecting the displayed measurement. When these errors are combined, first with each other (those associated directly with one instrument, its tolerance) and then with the total errors of any additional instruments used to provide the displayed value (i.e., an instrument loop), the result is the total uncertainty of the viewed reading.
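
As a minimal sketch of that combination (with purely hypothetical error values), the root-sum-square method commonly used for uncorrelated error sources, treated formally in Reference 1, looks like this; a worst-case budget would simply sum the magnitudes instead:

```python
import math

def rss_uncertainty(errors):
    """Combine independent error contributions by root-sum-square (RSS),
    the usual approach when the error sources are uncorrelated."""
    return math.sqrt(sum(e**2 for e in errors))

# Hypothetical error budget for an instrument loop, in deg C:
# imager accuracy, emissivity estimate error, reflected background error.
loop_errors = [2.0, 1.2, 0.5]
print(f"Total loop uncertainty: +/- {rss_uncertainty(loop_errors):.2f} deg C")
```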

Why Calibrate?

Perhaps an example is in order. Let’s say the recommended air pressure in a name-brand tire is found to cause rollovers in a model of vehicle. A subsequent bulletin informs users of this pressure and the new recommended pressure. At this point, what provides the user with the assurance that the tire pressure read by a device is in fact that pressure, or that the tested pressure was in fact the reported pressure? Taken one step further, what assures the same user when the pressure is read on another continent with a different instrument? The answer is the science known as metrology: the calibration and records system required to provide assurance of a traceably accurate (to NIST or similar) instrument.

Another example might be the chair you are sitting in. Odds are that the parts used to build, cover, and package the chair were manufactured at more than one site. How then did the back fit so well into the seat, the paint or chrome adhere to the metal in an even process, or the print on the corrugated packaging get applied so evenly across what must have been quite a few containers? The answer lies in repeatably accurate measurements from instruments sharing a similar chain of measured accuracy.
Now then, why calibrate your radiometry? The answer lies in how you use the equipment and in being right about the possible exceptions you view.

Thermographers provide at a minimum:

  • Specific IR related temperatures
  • Delta IR related temperatures
  • Thermal and visual images of suspected findings
  • Comparative IR related data

As a thermographer, you are expected to provide the most correct thermally related information. Accurate temperature values derived from your IR equipment can ONLY be obtained with calibrated instruments. Uncalibrated instruments may indicate within the displayed view that an object is hot or cold, but how hot the object is cannot be proven. Proving your reported temperature value is a fundamental reason for calibration of infrared radiometers. If your instrument is calibrated and your customer (or their attorney) asks, “Was the object in question as hot or cold as you reported?”, the simple answer is yes. The more complex answer would include a degree of tolerance (possible error).

Still not convinced? Try telling your customer that they have a 45 C degree delta in a bolted part of a bus carrying a few thousand amps, then find out that they scheduled a teardown of the reported system with contracted electricians, who report no problem found.

Accuracy of Your Imager and Calibration Standard

The basic statement of accuracy for your imager will not (typically) include the compiled inaccuracies of your equipment. It will typically state that the viewed temperature will be +/- some value (often 2% of full-scale reading) up to a set temperature, and some greater value above this temperature. Your calibrating instrument’s accuracy should be at least 1/3 to 1/4 of your instrument’s accuracy (i.e., for a 2% instrument, a standard must be at 0.67 to 0.5% of full-scale reading). As an example, consider:

A typical imager’s accuracy is +/- 2 deg C up to 100 deg C, or +/- 2% of the reading, whichever is greater. With this stated value, at, say, 250 deg C, your instrument may actually be reading 245 to 255 deg C (+/- 5 deg C).
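
A short sketch of this arithmetic, assuming the illustrative specification above and a 4:1 test uncertainty ratio:

```python
def imager_tolerance(reading_c, abs_limit_c=2.0, pct_limit=0.02):
    """Tolerance band for a '+/- 2 deg C or +/- 2% of reading, whichever
    is greater' specification (the limits are illustrative defaults)."""
    return max(abs_limit_c, pct_limit * reading_c)

reading = 250.0                  # deg C
tol = imager_tolerance(reading)  # +/- 5.0 deg C at 250 deg C
print(f"Imager may read {reading - tol:.1f} to {reading + tol:.1f} deg C")

# Required standard accuracy at a 4:1 test uncertainty ratio:
print(f"Standard should be good to +/- {tol / 4:.2f} deg C at this point")
```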

When working with radiant temperature readings, this kind of accuracy (i.e., 0.5%) is difficult to obtain. The reason is that Emissivity/Absorptance (E/A), Reflectivity (R), and Transmittance (T) tend to induce errors at or greater than 2% of the displayed value. For this reason, primary radiant energy standards are usually laboratory blackbodies. These standards are designed to greatly reduce ERT errors: R and T will be negligible, and E will be within fractions of a percent of the value given for the blackbody. These standards’ temperatures are typically accurate to better than +/- 0.5% of the indicated value. It is important to note that these standards often derive their temperature value and oven/refrigeration control from embedded thermometry and, because of this, can be calibrated to an even tighter tolerance if required.
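
To see why emissivity alone can swamp that kind of accuracy, consider a minimal sketch using a simplified broadband (Stefan-Boltzmann) model for an opaque target; real imagers use band-limited calibration curves, so this only illustrates the direction and rough size of the error:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def object_temp_k(radiance, emissivity, t_background_k):
    """Solve W = e*sigma*Tobj^4 + (1 - e)*sigma*Tbg^4 for Tobj
    (opaque target, transmittance assumed zero)."""
    t4 = (radiance / SIGMA - (1.0 - emissivity) * t_background_k**4) / emissivity
    return t4 ** 0.25

# A 100 deg C (373.15 K) target with true emissivity 0.95 in a 20 deg C room:
e_true, t_bg = 0.95, 293.15
w = e_true * SIGMA * 373.15**4 + (1 - e_true) * SIGMA * t_bg**4

# Entering E = 0.90 instead of 0.95 shifts the computed temperature:
print(object_temp_k(w, 0.90, t_bg) - 273.15)  # about 103 deg C, not 100
```

Under this model, a 0.05 error in the emissivity setting shifts the reading by roughly 3%, already larger than the 2% instrument specification discussed above.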

Calibration Certification

Certification of calibration of infrared radiometers is intended to provide a quality and legal tool assuring a manufacturer, user, or customer of a physical value’s actuality. In its simplest format, the certification is a document intended to prove traceable and repeatable accuracy, along with identity statements for the instrumentation, the instrumentation holder(s), and the performers of the calibration. Calibration requiring adjustment(s) can only be performed by your equipment’s manufacturer. Your IR equipment can, however, be verified as still in calibration (for a specific span) from home or office.

Calibration Standard for Infrared Imagery

If you send your equipment in for calibration, odds are that a blackbody will be used. These primary standards range in cost from approximately $500.00 used to $1,500.00 and up new, plus the cost of traceable certificates.

User Verification of their Instrument’s Calibration Status

One common technique used to verify that your instrument is still in cal is to bring distilled water to a freezing slush or just to a boil and observe the temperature of the vessel with your radiometer (a minimal sketch of the resulting pass/fail check follows the notes below). To do so requires:

  1. Assurance that your vessel is large enough that your radiometer’s spot size is smaller than the observed surface.
  2. That the vessel surface observed by your radiometer is flat and perpendicular to your line of sight.
  3. That you can reasonably determine the vessel surface E (i.e., by painting it with a high-temperature paint) and by exercising your Level II skills.
  4. That lighting and other heat sources have minimal (R and T) impact.
  5. That you use traceable contact thermometry to ensure the expected reading is at or close to approximately 0 or 100 degrees C. Calibrated (and traceable) thermometry is not that expensive; new and used equipment may be obtained for under a hundred dollars and calibrated for $50.00 and up (typically, annually).

Note 1: When heating your vessel, apply heat such that there is a minimal chance of convected heat rising along the vessel surface, between the vessel and the imager.

Note 2: Reference 5 provides a simple table depicting the temperature correction for a change in altitude/barometric pressure. Freezing and boiling water correlate with 0 and 100 degrees C only at sea level (29.92 inches of mercury).
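
A minimal sketch of the pass/fail check mentioned above, using an assumed linear pressure correction for the boiling point (roughly 0.94 deg C per inch of mercury near sea level; use the table in Reference 5 for real corrections):

```python
def boiling_point_c(pressure_in_hg):
    """Approximate boiling point of water near sea level, assuming a
    linear correction of about 0.94 deg C per in. Hg departure from the
    standard atmosphere (29.92 in. Hg)."""
    return 100.0 + 0.94 * (pressure_in_hg - 29.92)

def verify_point(ir_reading_c, reference_c, tolerance_c=2.0):
    """Pass/fail check of one verification point against the contact
    reference; tolerance defaults to an assumed +/- 2 deg C accuracy."""
    error = ir_reading_c - reference_c
    return abs(error) <= tolerance_c, error

# Boiling-water check on a day when the barometer reads 29.40 in. Hg:
ok, err = verify_point(ir_reading_c=101.0, reference_c=boiling_point_c(29.40))
print(f"error {err:+.1f} deg C -> {'in cal' if ok else 'out of cal'}")
```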

Another technique requires comparing your instrument to a calibrated blackbody. Maintain your records much as you do when maintaining your reports to your clients.

You will need to exercise repeatability between your annual readings, so consider the where, when, and how while performing your calibration.

A sample of a suggested documenting format is attached as Enclosure 1. This sample provides compliance with ISO 17025 and ANSI Z540 guidelines.

How Often Should the Instrument be Calibrated or Verified as in Calibration?

Typical IR instruments are calibrated or verified as “in cal” annually. Simple, single-point validation of your radiometer is often performed prior to a quantitative scan. Validation is a self-assurance concept and typically does not require the time consumed by calibration or multipoint cal verification. One of the simplest “slap on the door” validators can be calibrated and used daily, and costs about $20.00 (uncalibrated): Cole-Parmer’s or Oakton’s CaliMat (see Enclosure 2).

Cost to Calibrate

The cost for calibration of infrared radiometers by a third party, such as a calibration lab or the manufacturer’s calibration lab, varies broadly (typically $250 to $1,500). Manufacturers are the only source for correction of radiometry found out of calibration, due to the proprietary software in the radiometry. Calibration labs other than your radiometer’s manufacturer can only provide traceable proof that your radiometry is within the manufacturer’s (or your) tolerances and remains in cal.

If you intend to spend the extra dollars to obtain your own blackbody (highly recommended if you practice Level II thermography), some sources have been listed in Enclosure 2.

Traceability

The most comprehensive description of traceability and an excellent source for key terms in calibration are provided by NIST at https://www.nist.gov/fusion-search?s=traceability&op=

References

1) ANSI/NCSL Z540-2-1997 American National Standard for Expressing Uncertainty, US Guide to the Expression of Uncertainty in Measurement

2) NCSL RP-1 January 1996 National Conference of Standards Laboratories Recommended Practices for Establishment and Adjustment of Calibration Intervals

3) NCSL RP-3 January 11, 1990 National Conference of Standards Laboratories Recommended Practices for Calibration Procedures for Measuring & Test Equipment, Measurement Standards, and Measurement/Test Systems

4) Website to the National Institute of Standards and Technology: https://www.nist.gov

5) Infraspection Institute Level II Certified Infrared Thermographer Reference Manual

Calibration of Radiometric Equipment Enclosure 1


VERIFICATION OF CALIBRATION


Calibrated by:
Company Name:
Company Street Address:
City, State, Zip:
Phone:
Where Calibrated:


Equipment Information:
Manufacturer:
Model Number:
Serial Number:
Description:
Certification Number:
Calibration Date:
Calibration Frequency:
Calibration Due Date:


Calibration Status and Conditions:
Received:
Returned:
Distance to Target:
Room Temp. / Background:
Room Humidity:
Emissivity:
Lens/Filter:
Procedure Used: Comparative Analysis


Standard(s) Used for Calibration:
Manufacturer:
Model:
Cal Date:
Due Date:
Traceability:


Calibration Data:
Function:
Range:
Units:
Nominal:
Std Reading:
IUT Reading:
As Left:
Tolerance:
Comments:


Measurements Performed By:
Approved By:
Date of Issue:



Instructions for Completion of Form

Calibrated by will provide the location information of the company performing the calibration. This might be you, your subsidiary or a third party (Agent).

Where Calibrated will state just that.

Equipment Information will include all data making the calibrated instrument unique. The calibrator will insert the certification number. This number is typically a unique number providing historic traceability of this certification. It may be as simple as a unique file location number. It is your choice if you are performing the calibration.

Calibration Status and Conditions is designed to assist in future calibration repeatability and to ensure environmental basics were observed and fell within those typically specified by the calibrated instrument’s manufacturer. If a required parameter is not applicable to the model calibrated, simply mark N/A (i.e., lens/optics for a simple spot radiometer).

Standard(s) Used for Calibration will provide basic data needed to provide the traceable path of calibration to a national/international source. It is important to place the certification number for the most current (in cal) certificate held for this standard.

Calibration Data includes:
Function – typically the measured basic parameter, i.e., temperature

Range – the range of the device at this specific reading. It may be simply a 1, 2, 3 or depicted as 0 to 200, using just the digits of the range.

Units – sets the unit identification for all the readings given on this line, i.e., deg C

Nominal – digits expressing the anticipated reading (i.e., 100)

Standard Reading – the value obtained when the attempted nominal is reached (i.e., 100).

IUT Reading – the Instrument Under Test reading when the standard’s reading is taken (i.e., 102).

As Left – if equipment is not adjusted, this will typically be the same as IUT reading.

Tolerance – will typically be the stated accuracy given by the manufacturer. A caution: tolerance and accuracy are not the same, but they are generally interchanged in general Predictive/Preventative Maintenance thermography.

Comments – typically states in cal, out of cal, within mfgs specs, etc.
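
To make the Calibration Data fields concrete, here is a minimal sketch of one data line as a record, with the in cal / out of cal judgment for the Comments field (the values are the illustrative ones used above, with an assumed +/- 2 deg C tolerance):

```python
from dataclasses import dataclass

@dataclass
class CalPoint:
    """One line of the Calibration Data table; field names follow the form."""
    function: str       # e.g., "Temperature"
    range_: str         # e.g., "0 to 200" (underscore avoids the builtin name)
    units: str          # e.g., "deg C"
    nominal: float      # anticipated reading
    std_reading: float  # standard's value at the attempted nominal
    iut_reading: float  # Instrument Under Test reading
    tolerance: float    # manufacturer's stated accuracy, +/- in the same units

    def comment(self) -> str:
        """The in cal / out of cal judgment entered in the Comments field."""
        in_cal = abs(self.iut_reading - self.std_reading) <= self.tolerance
        return "in cal" if in_cal else "out of cal"

point = CalPoint("Temperature", "0 to 200", "deg C", 100.0, 100.0, 102.0, 2.0)
print(point.comment())  # "in cal" -- the 2 deg C error sits exactly at the limit
```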

Signatures are from two individuals, for quality assurance and witnessing reasons. The approver is typically administrative (ensuring the document is filled out properly) and a passive observer of all or parts of the calibration process.

Calibration of Radiometric Equipment Enclosure 2


Blackbody Calibrators


Omega: https://www.omega.com/pptst/BB701.html (or replace BB701 with BB703 or BB704 or BB705).

HART: https://www.alphacontrols.com/Hart_Handheld_infrared_Calibrator.htm

CaliMat: 26°F – 56°F www.coleparmer.ca (800-363-5900)

CaliMat: 58°F – 88°F Oakton part no. 35625-55 (available on Amazon.com and Davis.com)

Korean: http://www.korins.com/m/ca/

Williamson IR: https://www.williamsonir.com

Isotech North America: http://www.isotechna.com/Menu.asp?Param1=1&Param2=Infrared
