Spectroscopy and Spectrochemistry, Visible and Ultraviolet

Spectrum analysis was launched in 1859-1860 by the physicist Gustav Robert Kirchhoff and the chemist Robert Wilhelm Bunsen. They demonstrated experimentally that each chemical element has its own characteristic set of spectrum lines, which it emits or absorbs when heated to the state of a radiating gas by a Bunsen-burner flame, an electric arc or a spark.

Within the context of nineteenth-century science, the various patterns in these spectra could only be described, catalogued, and mapped, albeit quite extensively. An understanding of these patterns of series and bands, and of the splitting of lines into components by an electric or magnetic field (the Stark and Zeeman effects), had to wait until the twentieth century.

Niels Bohr’s atomic model of 1913 interpreted spectrum lines as the result of electron jumps between stable orbits around the nucleus. Quantum mechanics abandoned the orbits but kept the notion of energy levels, henceforth to be calculated from Schrödinger’s equation, yielding even better agreement between observation and theoretical prediction, both for the allowed transitions (selection rules) and for their energies (line frequencies).
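The arithmetic linking energy levels to line frequencies is easy to reproduce. The following is a minimal sketch using the textbook Bohr energies for hydrogen, not any historical computation; the constant value and the function names are our own illustrative choices.

```python
# Minimal sketch: hydrogen line wavelengths from Bohr energy levels.
# E_n = -13.6 eV / n^2; a jump from n_hi to n_lo emits a photon carrying
# the energy difference, so lambda = h*c / (E_hi - E_lo).

H_TIMES_C = 12398.4  # h*c in eV*angstrom (approximate)

def level_energy(n):
    """Bohr energy of hydrogen level n, in eV."""
    return -13.6057 / n**2

def line_wavelength(n_hi, n_lo):
    """Wavelength in angstroms of the n_hi -> n_lo transition."""
    delta_e = level_energy(n_hi) - level_energy(n_lo)
    return H_TIMES_C / delta_e

# The Balmer series (jumps down to n = 2) gives the visible hydrogen lines:
for n in range(3, 7):
    print(f"n={n} -> 2: {line_wavelength(n, 2):6.0f} angstrom")
# Prints roughly 6561, 4860, 4339, 4101 (H-alpha through H-delta).
```

These values lie within a couple of ångströms of the measured Balmer lines (e.g., Hα at 6563 Å), the kind of agreement that made the quantum treatment so persuasive.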

The dramatic success of spectrum analysis after 1860 caused this qualitative analytic technique to make quick inroads into the chemist’s or pharmacist’s laboratory, the astronomer’s observatory, the physician’s hospital, and even the judge’s court. The most notable use of the new technique involved identifying the presence of the various elements in a given sample. It led to the surprising finding that the metal lithium, for instance, hitherto considered quite rare, was among the more ubiquitous chemical elements.

News that it was possible to decipher the chemical composition of the sun and ultimately the stars spread fast: who could fail to be impressed by the fact that only a microscopic amount of sodium (3 × 10⁻⁹ g), a mere teaspoon of salt in a full swimming pool, was needed to detect its characteristic yellow D lines in a Bunsen flame? Using a simple pocket spectroscope, a steel caster could now easily identify the exact instant of decarburization of molten steel, the moment just before it loses its fluidity.
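The teaspoon-in-a-pool comparison survives a rough order-of-magnitude check. In the sketch below, the teaspoon (~5 g of salt) and pool (~2500 m³) sizes are our illustrative assumptions; both figures then work out to sodium at a dilution of a few parts per billion.

```python
# Rough order-of-magnitude check of the teaspoon-in-a-pool analogy.
# The teaspoon and pool sizes are illustrative assumptions.

salt_mass_g = 5.0            # one teaspoon of table salt, approx.
na_fraction = 23.0 / 58.44   # sodium's mass share of NaCl, ~0.39
pool_mass_g = 2.5e9          # ~2500 m^3 of water

pool_dilution = salt_mass_g * na_fraction / pool_mass_g
print(f"Na in pool:         {pool_dilution:.0e}")    # ~8e-10

# Detecting 3e-9 g of sodium in a sample of about a gram is a
# comparable dilution:
sample_dilution = 3e-9 / 1.0
print(f"Na in flame sample: {sample_dilution:.0e}")  # 3e-09
```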

Quantitative emission spectroscopy had a much more difficult start, however. Around 1906, Comte Arnaud de Gramont began methodically recording, for each element, the characteristic lines that remain visible as long as even the slightest trace of a substance is present. These raies ultimes, ultimate or residuary lines, were the most reliable indicators of the respective chemical elements in a sample.

For 83 different elements, including, besides most metals, a great many rare earths and nonconducting elements, de Gramont listed a mere 307 ultimate and penultimate lines. As a regular analyst for four French steel mills, de Gramont helped improve their production significantly, because he could readily report to them the presence and approximate concentrations of aluminum, boron, cobalt, chromium, copper, manganese, molybdenum, nickel, silicon, titanium, vanadium, and tungsten in their steel.
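In modern terms, de Gramont’s scheme amounts to matching an observed spectrum against a small table of each element’s most persistent lines. The sketch below illustrates that logic; the three-element table (with well-known resonance wavelengths) and the matching tolerance are our illustrative choices, not de Gramont’s actual working values.

```python
# Sketch of identification by raies ultimes: report an element as present
# only if all of its most persistent lines appear in the observed spectrum.
# Wavelengths in angstroms; table and tolerance are illustrative only.

ULTIMATE_LINES = {
    "Na": [5890.0, 5895.9],   # the yellow D lines
    "Ca": [4226.7],
    "Cu": [3247.5, 3274.0],
}

def identify(observed, tolerance=0.5):
    """Return the elements whose persistent lines all match the spectrum."""
    return [
        element
        for element, lines in ULTIMATE_LINES.items()
        if all(any(abs(obs - line) <= tolerance for obs in observed)
               for line in lines)
    ]

# A spectrum showing both sodium D lines plus one unrelated line:
print(identify([5890.1, 5895.8, 4500.0]))  # ['Na']
```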

During World War I, de Gramont and a few assistants used the method in a broad array of military applications, among them quick and efficient examinations of the structural frames and valves of zeppelins, shrapnel from long-range guns, and aircraft ignitors.

The spectroscopic laboratory of the National Bureau of Standards in Washington was among the very first to adopt de Gramont’s method of ‘‘practical spectrographic analysis.’’ After the war, William F. Meggers applied it to the chemical analysis and quality control of noble metals, such as gold and platinum, for the U.S. Mint in San Francisco. Promising analytical and metallurgical applications were explored by the American Brass Company, in Waterbury, Connecticut, as well as by a few other U.S. industrial laboratories.

Strangely enough, despite the obvious importance of de Gramont’s work for the French war machine, Germany implemented nothing even remotely similar. Carl Friedrich (called Fritz) Löwe was one of the earliest active promoters of the method’s industrial applications among chemists and physicians. Löwe’s frankness about the opportunities missed during the recent, lost World War was effective in arousing renewed interest in the method, and his touting of the instruments of the Zeiss Company in Jena reached German-speaking audiences.

Frank Twyman fulfilled a similar promotional function in the Anglo-Saxon world for his company, Adam Hilger Ltd. in London, and the Stockholm professor of experimental biology Henrik Gunnar Lundegårdh pointed out many applications in mineralogy, biochemistry, plant physiology, and agricultural chemistry. By 1930, a typical spectrochemical procedure took no longer than 20 minutes (including development of the photographic plate). A decade later, the industrial pressure for ever-higher production rates had ‘‘super-speed analysts’’, as Meggers called them, reduce this time to a minute or two.

The laboratories of the Ford Motor Company, for instance, carried out large numbers of analyses at high speed: samples were sent by pneumatic tube from the foundry to the spectrographic laboratory, and just a few minutes after receipt of a sample the results were available back on the factory floor. Spectrographic analysis left the sample virtually unscathed and allowed close examination of local differences between parts of its surface or between its various layers. By contrast, wet chemical analysis inevitably yielded average results, because the sample had to be dissolved for analysis.

Other applications of spectrochemical analysis after 1930 include:
- Absorption spectrophotometry of organic solutions for identification of hormones, vitamins, and other complicated substances.
- Testing for silver or boron content in the mining industry.
- Routine quality control in the metallurgical and chemical industries, including monitoring of isolation or separation processes.
- Soil analysis for agriculture and plant physiology.
- Applications in the food-packing industry (e.g., checking how quickly the inner coating of a tin can dissolves by measuring two or three parts of aluminum or lead per ten million, or testing chocolate and chewing-gum wrappers and whiskey-distilling vats).
- Forensic analyses or autopsies for detection of trace amounts of toxins (e.g., thallium from rat poison, which is ascertainable in hair samples).
- Analysis of fusible tin alloys for safety valves or fire sprinklers (to trace impurities such as lead and zinc, which may raise the melting point by undesirable amounts when present in proportions as small as one part in ten thousand).
- Archaeometric comparisons of the precise composition of metals and alloys from various locations (sometimes enabling archeologists to infer where a certain piece had been manufactured, or to draw conclusions about the geographic and temporal spread of certain technologies or skills).
- Plentiful applications in mineralogical analysis (which, as we have seen in the case of de Gramont, had motivated some of the earliest efforts in quantitative spectrochemical analysis).

The plethora of possibilities turned spectrochemistry into a vibrant and popular field. The industrial world embraced it in the following decades, setting up thousands of spectrochemical laboratories. The boom in this field of research can be gauged from publication statistics in spectrochemistry: 1467 books and papers, plus half a dozen treatises, were indexed in the first part of Meggers and Scribner’s bibliographic survey, covering 1920-1939. A total of 1044 contributions appeared in the short period of World War II, 1940-1945, another 1264 in the next five postwar years, and 1866 in the period 1951-1955.

A true explosion of the literature followed: exponential growth, seen in many scientific fields, led to an estimated total of 10,000 spectrochemical publications by 1963.
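As a rough consistency check on that estimate, the sketch below computes the annual growth rate implied by going from the roughly 5600 items counted through 1955 to 10,000 by 1963; it comes out near 7 percent per year, a doubling time of about a decade.

```python
import math

# Back-of-the-envelope check on the growth of the spectrochemical
# literature, using the counts cited above.
counted_through_1955 = 1467 + 1044 + 1264 + 1866   # = 5641 items
estimated_by_1963 = 10_000
years = 1963 - 1955

rate = (estimated_by_1963 / counted_through_1955) ** (1 / years) - 1
print(f"implied annual growth: {rate:.1%}")    # ~7.4%
doubling = math.log(2) / math.log(1 + rate)
print(f"doubling time: {doubling:.0f} years")  # ~10 years
```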

Visual resources like atlases were an integral part of the effective marketing strategies of the major spectrograph manufacturers: Zeiss, Hilger, or Fuess. The most ambitious inventorying effort was the famous Massachusetts Institute of Technology (MIT) table of 100,000 wavelengths. It was compiled with specially developed spectrophotometers capable of automatically measuring, computing, and recording the wavelengths of spectrum lines, thus speeding up these operations some 200-fold. As one of the major teaching and research centers for spectroscopy, MIT began hosting annual summer conferences on spectroscopy in 1933.

An initial attendance of 69 persons in the first year increased to 233 in 1938, 250 in 1939, and 302 in 1942. The series was interrupted for the remaining war years but resumed thereafter. The rapidly expanding market for spectrographs and spectrometers led to the initiation of specialized events such as the National Instrument Conference and Exhibit. An overlapping interest in spectrochemical instrumentation and techniques motivated the Society for Analytical Chemistry of Pittsburgh (SACP, founded in 1943) and the Spectroscopy Society of Pittsburgh (SSP, founded in 1946) to combine their annual meetings in 1949.

The joint meetings of these hitherto moderately sized societies, held every March since 1950 under the acronym Pittcon (Pittsburgh Conference and Exposition on Analytical Chemistry and Applied Spectroscopy), transformed spectrochemistry to the point that the convention eventually outgrew this steel-producing city and its organizers were forced to find other locations.

Whereas the first Pittsburgh Conference offered 56 presentations and 14 exhibits by commercial instrument makers, the 1990 conference (held at the Jacob Javits Center in New York) coordinated more than 1200 talks and 25 symposia, over 3000 instrument exhibits by over 800 commercial instrument makers, and 12,500 hotel bookings.

Both the high demand for spectrochemical techniques during World War II and the ubiquitous pressure for ever-faster results led inevitably to the increasing substitution of quasi-instantaneous photoelectric detection for photographic recording. This elimination of photographic development and densitometry in favor of photomultipliers and electronic automation was pushed particularly hard in the U.S., by companies such as the Dow Chemical Company in Midland, Michigan; Perkin Elmer in Boston; Baird Associates (BA) in Cambridge, Massachusetts; Applied Research Laboratories (ARL) in Glendale, California; and National Technical Laboratories, renamed Beckman Instruments in 1950, whose direct-reading spectrometers flooded the international market in the 1950s.

Advertisers claimed these ‘‘analysis automats’’ made ‘‘all routine spectrochemical analyses with dispatch and precision,’’ and in the 1950s and 1960s they eventually did. With these improvements also came a rapid expansion of potential applications, especially in infrared spectroscopy.
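At bottom, what a direct-reading spectrometer automates is a simple calibration step: the photomultiplier signal of an analyte line is ratioed against that of an internal-standard line and converted to a concentration through a previously fitted working curve. The sketch below shows that intensity-ratio logic in generic form; the line pair, calibration constants, and readings are invented for illustration and describe no particular instrument.

```python
import math

# Generic sketch of direct-reading quantitation: the ratio of an analyte
# line's signal to an internal-standard line's signal is mapped to a
# concentration via a fitted working curve,
#     log10(ratio) = a * log10(concentration) + b.
# All numbers below are invented for illustration.

A, B = 0.95, -1.30   # pretend constants fitted from known standards

def concentration(analyte_signal, standard_signal):
    """Concentration (percent) from one pair of photomultiplier readings."""
    ratio = analyte_signal / standard_signal
    return 10 ** ((math.log10(ratio) - B) / A)

# Example readings from one spark exposure:
print(f"{concentration(820.0, 15400.0):.2f} %")   # ~1.07 %
```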

The near-ultraviolet (UV) had already been explored photographically by Eleuthère Mascart and Alfred Cornu in the nineteenth century. However, glass optics absorb radiation below 3440 Å (1 ångström = 10⁻¹⁰ meters), which could be circumvented by using quartz or Iceland-spar prisms; atmospheric ozone absorbs wavelengths below 2900 Å; and the gelatin emulsions of photographic plates absorb those below 1850 Å.
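These cutoffs form a nested set of thresholds, each one capping how far into the ultraviolet an instrument could record. The sketch below simply encodes the three obstacles named above as data; it is a summarizing device of ours, not a historical tool.

```python
# The three ultraviolet cutoffs named above, as a simple threshold check.
# Wavelengths in angstroms.

CUTOFFS = [
    (3440, "glass optics absorb: use quartz or Iceland-spar prisms"),
    (2900, "atmospheric ozone absorbs: evacuate the light path"),
    (1850, "gelatin emulsions absorb: use gelatin-free plates"),
]

def obstacles(wavelength):
    """List the obstacles to recording a given ultraviolet wavelength."""
    return [note for cutoff, note in CUTOFFS if wavelength < cutoff]

for note in obstacles(1216):   # hydrogen Lyman-alpha, deep in the UV
    print(note)
```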

Further progress therefore had to await the development of high-vacuum spectrographs and gelatin-free emulsions. Both were pioneered by Victor Schumann in Leipzig, who reached wavelengths down to approximately 1000 Å, and by Theodore Lyman at Harvard University, who discovered the ultraviolet series of hydrogen in 1914. After World War II, grating spectrographs were mounted on rockets and propelled out of the terrestrial atmosphere to record high-resolution solar UV spectra.
