ABSTRACT

All objects at temperatures above absolute zero emit electromagnetic radiation. Radiation thermometry makes use of this fact to estimate the temperatures of objects by measuring the radiated energy from selected regions. Thermal imaging takes the process one stage further and uses the emitted radiation to generate a picture of the object and its surroundings, usually on a TV display or computer monitor, in such a way that the desired temperature information is easily interpreted by the user. All thermal imagers must have a detector or array of detectors sensitive to radiation in the required waveband, and optics to form an image of the object on the detector. The optimum waveband for thermal imaging is determined partly by the wavelength distribution of the emitted radiation, partly by the transmission of the atmosphere, and partly by the chosen detector technology.
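As a rough illustration of the first of these factors, the minimal Python sketch below evaluates Planck's law for a blackbody near room temperature and locates the peak of its emission spectrum, both numerically and via Wien's displacement law. The specific constants, the 300 K example temperature, and the 8-14 um long-wave atmospheric window noted in the comments are standard physics values assumed for illustration; they are not taken from this text.

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34       # Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
K_B = 1.380649e-23       # Boltzmann constant, J/K
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

def planck_spectral_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B_lambda(T), W * m^-2 * sr^-1 * m^-1."""
    x = H * C / (wavelength_m * K_B * temperature_k)
    return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(x)

if __name__ == "__main__":
    T = 300.0  # an object near room temperature, K
    wavelengths = np.linspace(2e-6, 20e-6, 2000)  # 2-20 um
    radiance = planck_spectral_radiance(wavelengths, T)

    peak_numeric = wavelengths[np.argmax(radiance)]
    peak_wien = WIEN_B / T
    print(f"Peak emission (numeric):    {peak_numeric * 1e6:.2f} um")
    print(f"Peak emission (Wien's law): {peak_wien * 1e6:.2f} um")
    # Both land near 9.7 um, inside the 8-14 um atmospheric window
    # commonly exploited by long-wave thermal imagers.
```

The sketch only addresses the emitted-radiation term; atmospheric transmission and detector response would further weight the choice of waveband in practice.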