Before the year 1800, the existence of the infrared portion of the electromagnetic spectrum wasn’t even suspected. The original significance of the infrared spectrum, or simply the infrared as it is often called, as a form of heat radiation is perhaps less obvious today than it was at the time of its discovery by Herschel in 1800.
The discovery was made accidentally during the search for a new optical material. Sir William Herschel – Royal Astronomer to King George III of England, and already famous for his discovery of the planet Uranus – was searching for an optical filter material to reduce the brightness of the sun’s image in telescopes during solar observations. While testing different samples of coloured glass that gave similar reductions in brightness, he was intrigued to find that some of the samples passed very little of the sun’s heat, while others passed so much heat that he risked eye damage after only a few seconds’ observation.
Herschel was soon convinced of the necessity of setting up a systematic experiment, with the objective of finding a single material that would give the desired reduction in brightness as well as the maximum reduction in heat. He began by repeating Newton’s prism experiment, but looking for the heating effect rather than the visual distribution of intensity in the spectrum. He first blackened the bulb of a sensitive mercury-in-glass thermometer with ink, and with this as his radiation detector he proceeded to test the heating effect of the various colours of the spectrum formed on top of a table by passing sunlight through a glass prism. Other thermometers, placed outside the sun’s rays, served as controls. As the blackened thermometer was moved slowly along the colours of the spectrum, the temperature readings showed a steady increase from the violet end to the red end.
This was not entirely unexpected, since the Italian researcher Landriani had observed much the same effect in a similar experiment in 1777. It was Herschel, however, who was the first to recognise that there must be a point where the heating effect reaches a maximum, and that measurements confined to the visible portion of the spectrum failed to locate this point. Moving the thermometer into the dark region beyond the red end of the spectrum, Herschel confirmed that the heating continued to increase. The maximum point, when he found it, lay well beyond the red end – in what are known today as the infrared wavelengths.
When Herschel revealed his discovery, he referred to this new portion of the electromagnetic spectrum as the thermometrical spectrum. The radiation itself he sometimes referred to as dark heat, or simply the invisible rays. Ironically, and contrary to popular opinion, it wasn’t Herschel who originated the term infrared. The word only began to appear in print around 75 years later, and it is still unclear who should receive credit as the originator.

Herschel’s use of glass in the prism of his original experiment led to some early controversies with his contemporaries about the actual existence of the infrared wavelengths. Different investigators, in attempting to confirm his work, used various types of glass indiscriminately, with different transparencies in the infrared. Through his later experiments, Herschel was aware of the limited transparency of glass to the newly discovered thermal radiation, and he was forced to conclude that optics for the infrared would probably be doomed to the exclusive use of reflective elements (i.e. plane and curved mirrors). Fortunately, this proved to be true only until 1830, when the Italian investigator Melloni made his great discovery that naturally occurring rock salt (NaCl) – which was available in natural crystals large enough to be made into lenses and prisms – is remarkably transparent to the infrared. The result was that rock salt became the principal infrared optical material, and remained so for the next hundred years, until the art of synthetic crystal growing was mastered in the 1930s.
Thermometers, as radiation detectors, remained unchallenged until 1829, the year Nobili invented the thermocouple. (Herschel’s own thermometer could be read to 0.2 °C (0.036 °F); later models could be read to 0.05 °C (0.09 °F).) Then a breakthrough occurred: Melloni connected a number of thermocouples in series to form the first thermopile. The new device was at least 40 times as sensitive as the best thermometer of the day for detecting heat radiation – capable of detecting the heat from a person standing three meters away.

The first so-called heat-picture became possible in 1840, the result of work by Sir John Herschel, son of the discoverer of the infrared and a famous astronomer in his own right. Based upon the differential evaporation of a thin film of oil when exposed to a heat pattern focused upon it, the thermal image could be seen by reflected light, where the interference effects of the oil film made the image visible to the eye. Sir John also managed to obtain a primitive record of the thermal image on paper, which he called a thermograph.
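The principle behind Melloni’s thermopile is simply that the small Seebeck EMFs of thermocouple junctions connected in series add up, so sensitivity grows linearly with the number of junctions. A minimal sketch of that scaling; the Seebeck coefficient and temperature difference below are illustrative values for the sake of the example, not Melloni’s actual figures:

```python
# Sketch: a thermopile sums the Seebeck EMF of each junction pair in series.
# Coefficient and temperature difference are illustrative, not historical values.

def thermocouple_emf(seebeck_uv_per_k: float, delta_t_k: float) -> float:
    """EMF (microvolts) of a single thermocouple junction pair."""
    return seebeck_uv_per_k * delta_t_k

def thermopile_emf(n_junctions: int, seebeck_uv_per_k: float, delta_t_k: float) -> float:
    """Series connection: total EMF scales linearly with junction count."""
    return n_junctions * thermocouple_emf(seebeck_uv_per_k, delta_t_k)

single = thermocouple_emf(40.0, 0.01)   # one junction, 0.01 K difference
stack = thermopile_emf(40, 40.0, 0.01)  # 40 junctions in series: 40x the signal
print(single, stack)
```

The same weak heat signal thus produces a 40-fold larger voltage from the stack than from a single junction, which is the kind of gain that let the thermopile outperform the best thermometers of the day.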
The improvement of infrared-detector sensitivity progressed slowly. Another major breakthrough, made by Langley in 1880, was the invention of the bolometer. This consisted of a thin blackened strip of platinum, connected in one arm of a Wheatstone bridge circuit, upon which the infrared radiation was focused and to which a sensitive galvanometer responded. This instrument is said to have been able to detect the heat from a cow at a distance of 400 meters.

An English scientist, Sir James Dewar, first introduced the use of liquefied gases as cooling agents in low-temperature research, such as liquid nitrogen with a temperature of –196 °C (–320.8 °F). In 1892 he invented a unique vacuum-insulated container in which liquefied gases can be stored for entire days. The common thermos bottle, used for storing hot and cold drinks, is based upon his invention.

Between the years 1900 and 1920, the inventors of the world discovered the infrared. Many patents were issued for devices to detect personnel, artillery, aircraft, ships – and even icebergs. The first operating systems, in the modern sense, began to be developed during the 1914–18 war, when both sides had research programs devoted to the military exploitation of the infrared. These programs included experimental systems for enemy intrusion detection, remote temperature sensing, secure communications, and flying-torpedo guidance. An infrared search system tested during this period was able to detect an approaching airplane at a distance of 1.5 km (0.93 miles), or a person more than 300 meters (984 ft) away.

The most sensitive systems up to this time were all based upon variations of the bolometer idea, but the period between the two wars saw the development of two revolutionary new infrared detectors: the image converter and the photon detector. At first, the image converter received the greatest attention from the military, because it enabled an observer for the first time in history to literally see in the dark.
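Langley’s bolometer readout, as described above, rests on a simple principle: absorbed radiation warms the blackened platinum strip, its resistance rises, and the previously balanced Wheatstone bridge produces a small differential voltage for the galvanometer. A minimal sketch of that principle, using illustrative component values (not Langley’s actual circuit):

```python
# Sketch of the bolometer readout principle: a warmed platinum strip
# unbalances a Wheatstone bridge. Component values are illustrative.

def bridge_output(v_supply: float, r1: float, r2: float, r3: float, r_bolo: float) -> float:
    """Differential voltage across a Wheatstone bridge.
    r1/r2 form one voltage divider, r3/r_bolo the other; the galvanometer
    reads the difference between the two divider midpoints."""
    v_a = v_supply * r2 / (r1 + r2)
    v_b = v_supply * r_bolo / (r3 + r_bolo)
    return v_b - v_a

# Platinum's resistance rises by roughly 0.39% per kelvin near room temperature.
ALPHA_PT = 0.0039
r_cold = 100.0                            # strip resistance in the dark (ohms, illustrative)
r_warm = r_cold * (1 + ALPHA_PT * 0.1)    # strip warmed 0.1 K by absorbed radiation

balanced = bridge_output(1.0, 100.0, 100.0, 100.0, r_cold)  # balanced bridge: 0 V
signal = bridge_output(1.0, 100.0, 100.0, 100.0, r_warm)    # small imbalance voltage
print(balanced, signal)
```

Even a 0.1 K warming of the strip yields a nonzero bridge voltage (on the order of tens of microvolts per volt of supply here), which is why pairing the bridge with a sensitive galvanometer gave Langley such remarkable reach.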
However, the sensitivity of the image converter was limited to the near-infrared wavelengths, and the most interesting military targets (i.e. enemy soldiers) had to be illuminated by infrared search beams. Since this involved the risk of giving away the observer’s position to a similarly equipped enemy observer, it is understandable that military interest in the image converter eventually faded. The tactical military disadvantages of so-called active (i.e. search-beam-equipped) thermal imaging systems provided the impetus, following the 1939–45 war, for extensive secret military infrared-research programs into the possibilities of developing passive (no search beam) systems around the extremely sensitive photon detector. During this period, military secrecy regulations completely prevented disclosure of the status of infrared-imaging technology. This secrecy only began to be lifted in the middle of the 1950s, and from that time adequate thermal-imaging devices finally began to be available to civilian science and industry.