It's all relative

How do you tell if a pan of water is hot? If it's near the boiling temperature, you might see small air bubbles appearing on the sides of the pan and know from experience that it's too hot to touch. But, if it's only ten or twenty degrees above your body temperature, you won't see bubbles. You could put your finger in it and your sense of touch would tell you that it's hotter than your finger, without causing blisters.

The problem is that, viewed from a distance, that same pan of water might be the result of a pan of ice that recently melted. Your finger would now tell you it was cold, but your eye can't tell the difference. So, how do you measure the temperature of something that might hurt you if you touch it?

The background

In 1714, Gabriel Fahrenheit invented the mercury thermometer, consisting of a column of mercury in a narrow tube of glass with a reservoir bulb at one end. Mercury expands with temperature faster than glass does, and that difference can be measured and calibrated to indicate temperature.

A century later, Thomas Seebeck discovered that a temperature difference between the junctions of two dissimilar metals resulted in an electric current flow in the loop. This became the basis for thermocouples, which are used today to measure temperature.

Both of these methods rely on the flow of heat from the measured surface to the sensor through direct contact. That flow of heat takes time, so you may have to wait to get a good reading. And, you may actually change the temperature of the object you're measuring.

In our water experiment, we might measure the temperature of one thermocouple junction with our mercury-in-glass thermometer and then, placing the other junction in the water, measure the voltage appearing in the open thermocouple loop between the junctions. We could then determine the temperature of the water by adding the temperature difference between the two junctions to the reference-junction temperature read by the mercury thermometer. We do exactly that when we use a Fluke DMM with temperature measurement capability, except that an electronic method determines the reference junction temperature of the meter.
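The arithmetic behind that measurement can be sketched in a few lines. This is only an illustration: it assumes a linearized type-K thermocouple sensitivity of roughly 41 microvolts per degree Celsius, where a real meter uses standardized polynomial tables; the voltage and temperature values are made up for the example.

```python
# Sketch of reference-junction (cold-junction) compensation.
# Assumes a linearized type-K sensitivity (~41 uV/degC); real meters
# use polynomial lookup tables from the thermocouple standards.

SEEBECK_UV_PER_C = 41.0  # approximate type-K sensitivity, uV per degC


def thermocouple_temp_c(loop_voltage_uv: float, reference_junction_c: float) -> float:
    """Measuring-junction temperature: the loop voltage gives the
    temperature DIFFERENCE between the junctions, which we add to the
    independently measured reference-junction temperature."""
    delta_c = loop_voltage_uv / SEEBECK_UV_PER_C
    return reference_junction_c + delta_c


# Hypothetical example: reference junction at 22 degC, loop reads 2870 uV
print(thermocouple_temp_c(2870.0, 22.0))  # 92.0 degC
```

The key point the code makes explicit is that the thermocouple alone only reports a difference; some other means (the mercury thermometer in the text, or the meter's internal sensor) must supply the absolute reference temperature.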

Non-contact temperature measurement

Can we measure the temperature of the water's surface without touching it? The answer lies in the work of Joseph Fourier, who described infrared radiation as "dark heat" in the early 1800s, and Gustav Kirchhoff, who published his law of thermal radiation in 1859. Kirchhoff's work laid out rules about how objects absorb and emit thermal energy. The concept is informally stated as, "Poor emitters are good reflectors and good emitters are poor reflectors." Non-contact IR thermometers use an optical system and a special detector to sense infrared radiation from a surface, and then translate that reading into temperature. Since not all surfaces emit infrared radiation equally, when we use infrared radiation as the basis for our temperature measurement, we must account for the emissivity of the measured surface.
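A simplified model shows why the emissivity setting matters. The sketch below assumes gray-body behavior and total (Stefan-Boltzmann) radiation; real instruments work over a limited infrared band and are calibrated accordingly, so treat this as illustrative, not as how any particular meter computes its answer.

```python
# Simplified model of an IR thermometer: the detector sees energy
# EMITTED by the surface plus background energy REFLECTED off it.
# Assumes gray-body surfaces and Stefan-Boltzmann (total) radiation.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4


def detected_power(t_surface_k: float, t_background_k: float, emissivity: float) -> float:
    """Radiant power per unit area reaching the detector."""
    emitted = emissivity * SIGMA * t_surface_k ** 4
    reflected = (1.0 - emissivity) * SIGMA * t_background_k ** 4
    return emitted + reflected


def apparent_temp_k(power: float, t_background_k: float, emissivity_setting: float) -> float:
    """Temperature the meter would report for a given emissivity setting:
    back out the assumed reflected term, then invert T^4."""
    e = emissivity_setting
    t4 = (power / SIGMA - (1.0 - e) * t_background_k ** 4) / e
    return t4 ** 0.25


# Hypothetical example: surface at 340 K (about 152 degF), 295 K room.
# With the emissivity setting matching the surface, the meter recovers
# the true temperature.
p = detected_power(340.0, 295.0, 0.95)
print(apparent_temp_k(p, 295.0, 0.95))  # about 340
```

Setting the wrong emissivity in `apparent_temp_k` mis-apportions the detected power between emitted and reflected energy, which is exactly the error the experiments below demonstrate.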

We can use a recently introduced Fluke 566 or 568 Infrared (IR) Thermometer to conduct some experiments to better understand what's going on. These meters are handy because they can measure temperature by either IR (non-contact) or thermocouple (contact) methods.

Kitchen science

The balance of this column will be my crude attempt to show how we can use one of these meters to test the concepts. The caution here is that, for best effect, we need a significant difference between the ambient temperature of the thermometer and the object we're measuring. That's because the meter contains compensating circuits for its own ambient temperature, which could make the results of the following tests less pronounced.

For my heat source I heated a pan of water on the kitchen range. Then, using a technique I first saw my mother use years ago, I re-created a variation of what she called a double boiler: I floated a smaller aluminum pan in the boiling water. In the bottom of the smaller pan I placed the thermocouple bead for my surface contact measurement and a small black velvet pad as a good emitter for my IR measurements. I turned the heat down and conducted my measurements on the slowly cooling objects.

At the default emissivity setting (0.95) of the Fluke 566 I was using, the two readings agreed remarkably well: 152.0° F (IR) vs. 152.5° F (thermocouple).

I then removed the black pad and read the shiny surface of the pan bottom with the IR thermometer. The IR reading dropped dramatically. Adjusting the meter's emissivity down to 0.10 raised the reading, but even then, the IR reading was 70 degrees lower than the thermocouple reading. Why? It's because I was actually measuring the temperature of the nearby wall reflected in the pan.
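The same gray-body model puts a number on that reflection effect. The figures here are assumed round values, not my actual measurements: a pan bottom at about 340 K (roughly 152° F), a 295 K wall, and a polished-aluminum emissivity of about 0.05.

```python
# How much of the signal from a shiny surface is actually the wall?
# Assumed values: pan at 340 K, wall at 295 K, polished aluminum
# emissivity ~0.05. Gray-body, Stefan-Boltzmann illustration only.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

t_pan_k, t_wall_k, e_pan = 340.0, 295.0, 0.05

emitted = e_pan * SIGMA * t_pan_k ** 4            # energy from the pan itself
reflected = (1.0 - e_pan) * SIGMA * t_wall_k ** 4  # wall energy bounced off the pan

print(reflected / (emitted + reflected))  # roughly 0.9 -- mostly wall
```

With over ninety percent of the detected energy coming from the wall, it's no surprise the IR reading tracked the wall temperature rather than the pan's.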

If the surface of the pan had a dull finish, you could get reasonable results by using the meter's built-in emissivity setting for oxidized (non-shiny) aluminum, because you would be reading the emitted energy, not energy reflected from another surface.

So, here's another caution when using IR. If you're not sure about what emissivity setting to use, then just do as I did. Be careful to avoid reflective surfaces, or place a patch of cloth tape or flat black paint on your target. Use the thermocouple to verify the setting for the IR meter. Then, you can proceed with the faster non-contact measurements with some confidence.