Infrared Thermography - A Historical Perspective

Our eyes see only the tiny fraction of the sun's emitted energy that takes the form of visible light. However, if we could see the infrared rays emitted by all bodies--organic and inorganic--we could effectively see in the dark. Though invisible to the human eye, infrared radiation can be detected as a feeling of warmth on the skin, and even objects that are colder than ambient temperature radiate infrared energy. Some animals, such as rattlesnakes, have small infrared temperature sensors located under each eye that can sense the amount of heat given off by a body. These sensors help them locate prey and protect themselves from predators.

Non-contact temperature sensors use the concept of infrared radiant energy to measure the temperature of objects from a distance. By measuring the intensity and wavelength of the energy an object emits, the sensor can apply equations that take into account the body's material and surface qualities to determine its temperature. In this chapter, we will focus on the history of radiation thermometry and the development of non-contact temperature sensors.
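
To make the idea concrete, here is a minimal sketch in Python, assuming the simplest possible case: a broadband sensor that reports total radiant exitance from a surface of known emissivity. The function name and values are illustrative only, not a real instrument's algorithm; practical sensors add spectral filtering and calibration corrections. The sketch inverts the Stefan-Boltzmann law discussed later in this chapter.

```python
# Illustrative only: estimate temperature from measured infrared exitance
# by inverting M = emissivity * sigma * T^4 (the Stefan-Boltzmann law).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def temperature_from_exitance(m_watts_per_m2, emissivity):
    """Estimate surface temperature (kelvin) from total radiant exitance."""
    return (m_watts_per_m2 / (emissivity * SIGMA)) ** 0.25

# A surface with emissivity 0.9 radiating 1000 W/m^2:
print(temperature_from_exitance(1000.0, 0.9))  # ~374 K (about 101 degC)
```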

IR Through the Ages

Although not apparent, radiation thermometry has been practiced for thousands of years. The first practical infrared thermometer was the human eye (Figure 1-1). The human eye contains a lens which focuses emitted radiation onto the retina. The retina is stimulated by the radiation and sends a signal to the brain, which serves as the indicator of the radiation. If properly calibrated based on experience, the brain can convert this signal to a measure of temperature.



People have been using infrared heat to practical advantage for thousands of years. Clay tablets and pottery dating back thousands of years show that the sun's heat was used to raise the temperature of materials in order to produce molds for construction. The pyramids, built between approximately 2700 and 2200 B.C., were made of sun-dried bricks. The Egyptians also made metal tools such as saws, cutting tools, and wedges, crafted by the experienced metalworkers of their time. These craftsmen had to know how hot the metal was before they could form it, and they most likely judged this from experience by the color of the heated metal.

Because fuel for firing was scarce, builders of Biblical times had to depend on the sun's infrared radiation to dry the bricks for their temples and pyramids. The Mesopotamian remains of the Tower of Babel indicate that it was made of sun-dried brick faced with burnt brick and stone. In India, a sewer system dating back to 2500 B.C. carried wastewater through pottery pipes into covered brick drains along the street, which discharged into brick culverts leading to a stream.

In ancient Greece, as far back as 2100 B.C., Minoan artisans produced vases, statues, and textiles. By sight, they could judge when a piece of material was hot enough to be shaped. Terra-cotta pipes were made by heating the clay to the proper temperature and casting it in a mold.

In more recent times, craftsmen have relied on their own senses to judge when a material is at the correct temperature for molding or cutting. Sight has been used in steel working, glass working, wax molding, and pottery. From experience, skilled craftsmen learned to estimate the degree of heat in the kiln, smelter, or glass furnace by the color of the interior of the heating chamber, just as a classical blacksmith might judge the malleability of a horseshoe by its cherry-red color.

In countries around the world, this technique of judging temperature by sight is still in use. In Europe, glass-molding craftsmen use sight to determine when glass is ready to be shaped (Figure 1-2). They place a large piece of glass in a heating furnace on the end of a long metal rod. When the glass reaches the desired color and brightness, they pull it out of the furnace and immediately form it into the shape they want. If the glass cools and loses the desired color or brightness, they return it to the furnace or discard it. The glassmakers know by sight when the glass is ready. If you own a glass chandelier or hand-made glassware from Europe, it was most likely formed in this way.

From Newton to Einstein

The thermometer was invented in Italy by Galileo Galilei (1564-1642), about two hundred years before infrared light itself was discovered in 1800, and about one hundred years before the great English scientist Sir Isaac Newton (1642-1727) investigated the nature of light by experimenting with prisms. As published in Opticks in 1704, Newton used glass prisms to show that white light could be split up into a range of colors (Figure 1-3). The least bent portion of the light consisted of red, followed in order by orange, yellow, green, blue, indigo, and violet, each merging gradually into the next. Newton also showed that the different colors could be fed back through another prism to produce white light again. Newton's work made it clear that color was an inherent property of light and that white light was a mixture of different colors. Matter affected color only by absorbing some kinds of light and transmitting or reflecting others.



It was also Newton who, in 1675, proposed that light was made up of small particles, or "corpuscles." With this theory in hand, Newton set out to measure the relative sizes of these corpuscles. From observations of the eclipses of the moons of Jupiter, Newton realized that all light travels at the same speed. Based on this observation, he determined the relative sizes of the different colors of light particles from their angles of refraction.

In 1678, Christiaan Huygens (1629-1695), a mathematician, astronomer, and natural scientist, challenged Newton's corpuscular theory, proposing that light could be better understood as consisting of waves. Through the 1800s the wave theory was widely accepted, and it eventually became central to James Clerk Maxwell's theory of electromagnetic radiation.

Ironically for the field of infrared thermometry, infrared radiation was first discovered by using a conventional thermometer. Frederick William Herschel (1738-1822), a scientist and astronomer, is known as the father of sidereal astronomy. He studied the planets and was the first scientist to fully describe the Milky Way galaxy. He also contributed to the study of the solar system and the nature of solar radiation. In England in 1800, he was experimenting with sunlight. While using colored glasses to look at the sun, Herschel noticed that the sensation of heat was not correlated with visible light (Figure 1-4). This led him to experiment with mercury thermometers and glass prisms and to correctly hypothesize the existence of invisible infrared heat waves. Until Herschel, no one had thought to put a thermometer and a prism together to try to measure the amount of heat in each color.

In 1800, Herschel formed a spectrum of sunlight and tested different parts of it with a thermometer to see if some colors delivered more heat than others. He found that the temperature rose as he moved toward the red end of the spectrum, and it seemed sensible to move the thermometer just past the red end in order to watch the heating effect disappear. It did not. Instead, the temperature rose higher than ever at a spot beyond the red end of the spectrum (Figure 1-4). The region was called infrared, which means "below the red."



How to interpret the region was not readily apparent. The first impression was that the sun delivered heat rays as well as light rays and that heat rays refracted to a lesser extent than light rays. A half-century passed before it was established that infrared radiation had all the properties of light waves except that it didn't affect the retina of the eye in such a way as to produce a sensation of light.



The German physicist Joseph von Fraunhofer (1787-1826) investigated the solar spectrum in the early 1800s. His spectroscope passed sunlight through a slit to produce parallel rays of light, which then struck a prism that broke the light into its constituent rays. The result was a multitude of lines, each an image of the slit and each containing a very narrow band of wavelengths. Some wavelengths, however, were missing: the slit images at those wavelengths were dark, so the solar spectrum was crossed by dark lines. These lines would later become important to the study of emission and radiation.

In 1864, James Clerk Maxwell (1831-1879) set forth for the first time the equations that comprise the basic laws of electromagnetism. They show how an oscillating electric charge radiates waves through space at definite frequencies that determine the wave's place in the electromagnetic spectrum--now understood to include radio waves, microwaves, infrared waves, visible light, ultraviolet waves, X-rays, and gamma rays.

In addition, the most profound consequence of Maxwell's equations was a theoretical derivation of the speed of electromagnetic waves--300,000 km/sec--extremely close to the experimentally measured speed of light. Maxwell wrote, "The velocity is so nearly that of light, that it seems we have strong reason to conclude that light itself...is an electromagnetic disturbance in the form of waves propagated through the electromagnetic field according to electromagnetic laws." Maxwell was thus able to predict the entire electromagnetic spectrum.

The German physiologist and physicist Hermann von Helmholtz (1821-1894) accepted Maxwell's theory of electromagnetism, recognizing that one of its implications was a particle theory of electrical phenomena. "If we accept the hypothesis that the elementary substances [elements] are composed of atoms," stated Helmholtz in 1881, "we cannot avoid concluding that electricity, also, positive as well as negative, is divided into elementary portions which behave like atoms of electricity."

Gustav Robert Kirchhoff (1824-1887), a physicist and mathematician, worked in 1859 with Robert Bunsen (1811-1899), an inorganic chemist and physicist, on a spectroscope that contained more than one prism. It permitted greater separation of the spectral lines than Fraunhofer's spectroscope could achieve. They were able to prove that each chemical element emits a characteristic spectrum of light that can be viewed, recorded, and measured. The realization that the bright lines in the emission spectra of the elements exactly coincided in wavelength with the dark lines in the solar spectrum indicated that the same elements emitting light on earth were absorbing light in the sun. As a consequence of this work, in 1859 Kirchhoff developed a general theory of emission and radiation known as Kirchhoff's law. Simply put, it states that a substance's capacity to emit light is equivalent to its ability to absorb it at the same temperature.

The following year, Kirchhoff set forth the concept of a blackbody, one of the consequences of his law of radiation. A blackbody is defined as any object that absorbs all frequencies of radiation when heated and then gives off all frequencies as it cools. This concept was fundamental to the development of radiation thermometry. The blackbody problem arose from the observation that when an iron rod is heated, it gives off heat and light. At first its radiation is invisible, or infrared, but it then becomes visible and red-hot. Eventually the rod turns white hot, indicating that it is emitting all colors of the spectrum. The spectral radiation, which depends only on the temperature to which the body is heated and not on the material of which it is made, could not be predicted by classical physics. Kirchhoff recognized that "it is a highly important task to find this universal function." Because of its general importance to the understanding of energy, the blackbody problem eventually found a solution.

The Austrian physicist Josef Stefan (1835-1893) first determined the relation between the amount of energy radiated by a body and its temperature. He was particularly interested in how hot bodies cool and how much radiation they emit. He studied hot bodies over a considerable range of temperatures, and in 1879 determined from experimental evidence that the total radiation emitted by a blackbody varies as the fourth power of its absolute temperature (Stefan's law). In 1884, one of his former students, Ludwig Boltzmann (1844-1906), derived the same law theoretically from thermodynamic principles and Maxwell's electromagnetic theory. The law, now known as the Stefan-Boltzmann fourth-power law, forms the basis for radiation thermometry. It was with this equation that Stefan was able to make the first accurate determination of the surface temperature of the sun, a value of approximately 11,000°F (6,000°C).
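
In modern notation the law reads M = sigma * T^4, where M is the total radiant exitance and sigma is approximately 5.67 x 10^-8 W/(m^2 K^4). A minimal sketch of the fourth-power behavior, using the roughly 6,000°C solar surface figure quoted above (the constant is the standard modern value, not given in the text):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def blackbody_exitance(t_kelvin):
    """Total power radiated per unit area of a blackbody (Stefan's law)."""
    return SIGMA * t_kelvin ** 4

# Doubling the temperature multiplies the radiated power by 2^4 = 16:
print(blackbody_exitance(600.0) / blackbody_exitance(300.0))   # 16.0
# At the ~6,000 degC (~6,273 K) solar surface temperature quoted above:
print(f"{blackbody_exitance(6273.0):.2e} W/m^2")               # ~8.8e+07
```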

The next quandary faced by these early scientists was the nature of the thermal radiation emitted by blackbodies. The problem was challenging because blackbodies did not give off heat in the way the scientists had predicted. The theoretical relationship between the spectral radiance of a blackbody and its thermodynamic temperature was not established until late in the nineteenth century.

Among the theories proposed to explain this inconsistency were those of the German physicist Wilhelm Wien and the English physicist Lord Rayleigh. Wilhelm Wien (1864-1928) measured the wavelength distribution of blackbody radiation in 1893. A plot of the radiation intensity versus wavelength resulted in a series of curves at different temperatures. With this plot, he was able to show that the wavelength of peak emission varies inversely with absolute temperature. As the temperature increases, not only does the total amount of radiation increase, in line with Stefan's findings, but the peak wavelength decreases and the color of the emitted light changes from red to orange to yellow to white.
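
This inverse relationship was later formalized as Wien's displacement law, lambda_max = b / T, where b is approximately 2898 um*K. A short sketch of the effect; the constant is a modern standard value, our addition rather than part of Wien's original plot:

```python
WIEN_B = 2898.0  # Wien displacement constant, um*K (approximate)

def peak_wavelength_um(t_kelvin):
    """Wavelength of peak blackbody emission, in micrometers."""
    return WIEN_B / t_kelvin

for t in (300, 1000, 6000):  # room temperature, red-hot metal, the sun
    print(t, "K ->", round(peak_wavelength_um(t), 2), "um")
# 300 K peaks near 9.7 um (far infrared); 6000 K near 0.48 um (visible)
```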

Wien attempted to formulate an empirical equation to fit this relationship. His equation worked well for high-frequency (short-wavelength) blackbody radiation, but not for low-frequency (long-wavelength) radiation; Rayleigh's theory, conversely, was satisfactory only at low frequencies.



In the mid-1890s, Max Karl Ernst Ludwig Planck (1858-1947), a German physicist and former student of Kirchhoff, and a group of Berlin physicists were investigating the light spectrum emitted by a blackbody. Because the spectra showed distinct lines of light rather than broad bands, they hypothesized that minute structures were emitting the light, and they began to develop an atomic theory that could account for spectral lines.

This was of interest to Planck because in 1859 Kirchhoff had discovered that the heat radiated and absorbed by a blackbody at all frequencies reaches an equilibrium that depends only on temperature and not on the nature of the object itself. Yet at any given temperature, light emitted from a heated cavity--a furnace, for example--runs the gamut of spectral colors, and classical physics could not predict this spectrum.

After several false starts, beginning in 1897, Planck succeeded in finding a formula predicting blackbody radiation: a formula that represented the observed energy of the radiation at any given wavelength and temperature. Its underlying notion was that light and heat are not emitted in a steady stream. Rather, energy is radiated in discrete units, or bundles. Planck discovered a universal constant, "Planck's constant," which was founded on physical theory and could be used to compute the observed spectrum. His theory assumed that energy consists of discrete units he called quanta, and that the energy E of each quantum is given by the equation E = hν = hc/λ, where ν (sec⁻¹) is the frequency of the radiation, λ is its wavelength, and h is Planck's constant--now known to be a fundamental constant of nature. By thus directly relating the energy of radiation to its frequency, an explanation was found for the observation that higher-energy radiation has a higher frequency. Planck's finding marked a new era in physics.
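
A minimal numeric sketch of the relation E = hν = hc/λ (the constants are standard modern values, not from the text), showing that shorter wavelengths carry more energy per quantum:

```python
H = 6.626e-34  # Planck's constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy_joules(wavelength_m):
    """Energy of a single quantum, E = h*c/lambda."""
    return H * C / wavelength_m

print(photon_energy_joules(10e-6))   # 10 um infrared:  ~2.0e-20 J
print(photon_energy_joules(0.5e-6))  # 0.5 um visible:  ~4.0e-19 J
```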

Before Planck's studies, heat was considered to be a fluid composed of repulsive particles capable of combining chemically with material atoms. In this theory, the particles of heat entered a system and moved between the particles of matter, and their mutual repulsion created a pressure that a thermometer detected. Planck's constant became known as a "fortunate guess." It allowed for theoretical equations that agreed with the observable range of spectral phenomena, and it was fundamental to the theory of blackbody radiation.

Albert Einstein (1879-1955) studied the works of Maxwell and Helmholtz. In 1905, Einstein used the quantum as a theoretical tool to explain the photoelectric effect, showing how light can sometimes act as a stream of particles. That year he published three papers in volume XVII of Annalen der Physik. In one, he set forth his now famous theory of relativity; another showed that a fundamental process in nature is at work in the mathematical equation that had resolved the problem of blackbody radiation.

Light, Einstein showed, is a stream of particles whose energy can be computed using Planck's constant. Within a decade, this prediction was confirmed experimentally for visible light.

Max Karl Ernst Ludwig Planck initiated quantum theory at the turn of the twentieth century and changed the fundamental framework of physics. Wrote Einstein, "He has given one of the most powerful of all impulses to the progress of science."

Today's Applications

The first patent for a total radiation thermometer was granted in 1901. The instrument used a thermoelectric sensor; it had an electrical output signal and was capable of unattended operation. In 1931, the first commercially available total radiation thermometers were introduced. These devices were widely used throughout industry to record and control industrial processes. They are still in use today, but mainly for low-temperature applications.

The first modern radiation thermometers were not available until after the Second World War. Originally developed for military use, lead sulfide photodetectors were the first infrared quantum detectors to be widely used in industrial radiation thermometry. Other types of quantum detectors have also been developed for military applications and are now widely applied in industrial radiation thermometry. Many infrared radiation thermometers use thermopile detectors, which are sensitive to a broad radiation spectrum and are extensively used in process control instrumentation.



Infrared thermometers are currently used in a wide range of industrial and laboratory temperature control applications. Non-contact temperature sensors make it possible to monitor objects that are difficult to reach because of extreme environmental conditions. They can also be used for products that must not be contaminated by a contact sensor, such as in the glass, chemical, pharmaceutical, and food industries. Non-contact sensors can be used when materials are hot, moving, or inaccessible, or when materials could be damaged, scratched, or torn by a contact thermometer.

Typical industries in which non-contact sensors are used include utilities, chemical processing, pharmaceutical, automotive, food processing, plastics, medical, glass, pulp and paper, construction materials, and metals. Industrially, they are used in manufacturing, quality control, and maintenance and have helped companies increase productivity, reduce energy consumption, and improve product quality.

Some applications of radiation thermometry include the heat treating, forming, tempering, and annealing of glass; the casting, rolling, forging, and heat treating of metals; quality control in the food and pulp and paper industries; the extrusion, lamination, and drying of plastics, paper, and rubber; and the curing of resins, adhesives, and paints.

Non-contact temperature sensors have been used and will continue to be valuable for research in military, medical, industrial, meteorological, ecological, forestry, agriculture, and chemical applications.

Weather satellites use infrared imaging devices to map cloud patterns and provide the imagery seen in many weather reports. Radiation thermometry can reveal the temperature of the earth's surface even through cloud cover.

Infrared imaging devices are also used for thermography, or thermal imaging. In the practice of medicine, for example, thermography has been used for the early detection of breast cancer and for locating the cause of circulatory deficiencies. In most of these applications, the underlying principle is that pathology produces local heating and inflammation that can be found with an infrared imager. Other diagnostic applications of infrared thermography range from back problems to sinus obstructions.

The burning edges of forest fires have been located using airborne infrared imagers. Because the longer wavelengths of the emitted infrared radiation typically penetrate smoke better than visible wavelengths do, the edges of the fire are better delineated.

On the research front, one sophisticated infrared thermometry application is the study of faults in metals and composites and at coating interfaces, a technique known as pulsed video thermography. A composite material consisting of a carbon-fiber skin bonded to an aluminum honeycomb is subjected to pulses of heat from a xenon flash tube. Infrared cameras record a frame-by-frame sequence of heat diffusion through the object, which is displayed on screen. Defects show up as deviations from the expected patterns for the material being tested.
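
The detection step can be sketched in a few lines, under heavy simplifying assumptions of our own: the recorded frames hold temperature rise above ambient, a sound area cools along a roughly straight line in log-log coordinates (as one-dimensional heat diffusion predicts), and a defect shows up as an anomalous per-pixel cooling slope. This is a toy screen, not the production algorithm:

```python
import numpy as np

def flag_defects(frames, z_threshold=3.0):
    """Toy pulsed-thermography screen. frames: (time, height, width) stack
    of temperature rise above ambient, recorded after the heat pulse.
    Flags pixels whose cooling slope deviates strongly from the average."""
    t = np.log(np.arange(1, frames.shape[0] + 1, dtype=float))
    y = np.log(frames.reshape(frames.shape[0], -1))
    # Per-pixel straight-line fit in log-log coordinates; [1] is the slope.
    slope = np.polynomial.polynomial.polyfit(t, y, 1)[1]
    slope = slope.reshape(frames.shape[1:])
    z = (slope - slope.mean()) / slope.std()
    return np.abs(z) > z_threshold  # boolean defect map

# Synthetic demo: one pixel cools anomalously slowly (a simulated disbond).
time = np.arange(1.0, 51.0)[:, None, None]
frames = 100.0 / np.sqrt(time) * np.ones((50, 8, 8))
frames[:, 4, 4] = 100.0 / time[:, 0, 0] ** 0.25
print(flag_defects(frames)[4, 4])  # True
```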

Among the military applications of radiation thermometry are night-vision and the "heat-seeking" missile. In the latter case, the operator simply launches the missile in the general direction of the target. On-board detectors enable the missile to locate the target by tracking the heat back to the source. The most widely known military infrared missile applications are the Sidewinder air-to-air missile and a satellite-borne intercontinental ballistic missile (ICBM) detection system.

Both rely on detecting the infrared signature of an exhaust plume or a very hot engine. The Sidewinder missile guidance system is shown schematically in Figure 1-5. A special infrared dome protects the optical system inside. The optical system consists of a primary and a secondary mirror and a set of correction lenses that focus an image onto a special reticle. All the light from the reticle is focused onto a detector (Figure 1-6). The reticle modulates the radiation, allowing the seeker to distinguish targets from clouds and to extract directional information.

Portable surface-to-air missiles (SAMs) are effective defense units that guide themselves to a target by detecting and tracking the heat emitted by an aircraft, particularly the engine exhaust.