Development of GC/MS in Environmental Analyses
Increasing computer capability yields improved methods
Figure 1: Increase in processor power and parallel GC/MS development.1

As computing technology has become more readily available, it has spread into analytical instrument control and made sophisticated data-handling features at the instrument level possible. Computer technology and instrumentation for analytical measurements have become firmly linked. This is especially true of techniques such as gas chromatography/mass spectrometry (GC/MS), which combines the separation of organic molecules with mass spectrometric detection and is used to measure organic chemicals in many types of materials across a variety of industries. GC/MS is data-intensive and requires large computing capacity to scan peaks frequently enough to obtain and store the information needed to identify compounds of interest accurately.
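To put that data load in perspective, the back-of-envelope estimate below shows how quickly a single run adds up; the scan rate, mass range, run time and storage format are illustrative assumptions, not figures from the text.

```python
# Illustrative estimate of the data volume one GC/MS run can generate.
# All numbers below are assumptions chosen for illustration only.

SCAN_RATE_HZ = 5          # full spectra acquired per second (assumed)
MZ_POINTS = 500 - 35      # m/z values recorded per spectrum, assumed range 35-500
BYTES_PER_POINT = 8       # e.g., 4-byte m/z bin + 4-byte intensity (assumed)
RUN_TIME_MIN = 30         # typical chromatographic run length (assumed)

spectra = SCAN_RATE_HZ * RUN_TIME_MIN * 60
data_points = spectra * MZ_POINTS
size_mb = data_points * BYTES_PER_POINT / 1e6

print(f"{spectra:,} spectra, {data_points:,} data points, ~{size_mb:.1f} MB per run")
```

Even with these modest assumptions, a single run produces millions of data points, which is why early systems strained the computers of their day.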
Development milestones
From the time gas chromatography (GC) became available in the mid-1950s, hardware development was intensive, peaking in the early 1960s with the introduction of integrators and capillary columns. In the late 1970s and early 1980s, another round of GC development peaked, this time driven by software to control the hardware and add automation capabilities.
GC/MS was first introduced as a commercial technique in the early 1960s and occupied nearly a small room's worth of space. Data collection at that time was an art, and interpretation was highly manual and labor-intensive. Furthermore, the price was several hundred thousand dollars. Fortunately, over the past 35 years, speed, convenience, data handling and space consumption have improved by several orders of magnitude. The increase in computer power since those early days, as measured by the number of transistors in the processor chip, is shown in Figure 1, along with some of the parallel milestones in GC/MS development. Today, one can buy a tabletop quadrupole GC/MS with low pumping capability for less than $60,000.
Changing market forces
In the early 1990s, the U.S. environmental analysis market went into a tailspin. Although the number of samples requiring analysis grew during this period, it was a difficult time for the industry, with widespread laboratory closures and layoffs. The market was highly regulated, and new regulations slowed at the same time that computer-driven improvements drastically increased laboratory throughput. Analytical capacity was high because the market had been attractive and the barrier to entry was low. As a result, market growth in the U.S. went from +25 percent in 1988 to -5 percent in 1994.2 Prices dropped dramatically, with the price of a benzene-toluene-xylenes analysis falling 14 percent in a single year, from 1995 to 1996.3
As the market began to consolidate, productivity became the key differentiator between laboratories that survived and those that did not, especially because labor costs were a large part of their business expense. The U.S. market eventually stabilized at about 800 commercial laboratories, down from the earlier 1,200. During this period, laboratories began to appreciate that analyses using GC/MS were more productive than GC with a compound-specific detector. GC analyses had to be performed on two different chromatography columns, with the compound's presence verified by matching the expected retention time on each column. In contrast, GC/MS gave confirmation in a single analytical run, because the GC retention time of a peak was confirmed by the mass spectral identification of the compound. This improved productivity by a factor of two, a major gain, and it also increased the quality of the results.
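A minimal sketch of that single-run confirmation logic is shown below: a peak is accepted only when its retention time falls within an expected window and its spectrum resembles a reference spectrum. The retention-time window, the match threshold and the simple cosine match score are illustrative assumptions, not values taken from any EPA method.

```python
import math

def spectral_match(measured: dict, reference: dict) -> float:
    """Cosine similarity between two spectra given as {m/z: intensity} dicts."""
    mzs = set(measured) | set(reference)
    dot = sum(measured.get(mz, 0.0) * reference.get(mz, 0.0) for mz in mzs)
    norm_m = math.sqrt(sum(v * v for v in measured.values()))
    norm_r = math.sqrt(sum(v * v for v in reference.values()))
    return dot / (norm_m * norm_r) if norm_m and norm_r else 0.0

def confirmed(rt_min: float, expected_rt_min: float, measured: dict,
              reference: dict, rt_window: float = 0.1,
              min_match: float = 0.8) -> bool:
    """Require both retention-time agreement and an acceptable spectral match."""
    return (abs(rt_min - expected_rt_min) <= rt_window
            and spectral_match(measured, reference) >= min_match)

# Hypothetical example: a benzene-like peak at 4.52 min vs. an expected 4.50 min
print(confirmed(4.52, 4.50,
                {78: 100, 77: 25, 51: 18},
                {78: 100, 77: 22, 51: 20}))   # -> True
```

Because both criteria come from one injection, the second confirmatory run required by dual-column GC methods is avoided.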
Environmental monitoring
The U.S. Environmental Protection Agency (EPA) was created in December of 1970, and routine monitoring developed over the next 10 years. The EPA needed to evaluate exposure to a long list of organic compounds in a variety of matrices, including soil, water and air. Regulations grew out of health-based assessments of a candidate list of compounds, labeled the priority pollutant list.
In the early days of GC/MS, many environmental laboratories used the technique only for special situations, such as the measurement of dioxin, where other techniques did not give the necessary component separation or sensitivity. For a long time, most routine analyses continued to be performed by GC with specialized detectors. For example, pesticides continued to be measured using method 8081A with GC and an electron capture detector (ECD), which is very specific and sensitive for halogen-containing compounds.

Table 1. U.S. EPA Methods Using GC/MS
Eventually, the increasing sensitivity of GC/MS, in combination with changes in the market, drove analyses previously performed by GC with specialized detectors over to GC/MS. Use of the technique in environmental laboratories for the analysis of potentially hazardous pollutants in air, water and soil has been growing, providing rapid, multi-dimensional information on compounds such as trichloromethane in drinking water and polychlorinated biphenyls in soil. Table 1 shows GC/MS methods that have been developed for various EPA programs. In many cases, each EPA program has developed its own method for similar analytes, but with different matrices and different quality assurance/quality control requirements.
Complex challenges
GC/MS methods, such as 8270C, are extremely challenging, with a comprehensive analyte list of more than 200 compounds, including pesticides listed in method 8080A. As a result, quality control tests are included to ensure that systems generate data of known and documented quality.
Samples arising from the Resource Conservation and Recovery Act (RCRA) program generally involve challenging matrices, such as dirt, wastes and contaminated waters, which add to sample complexity. Newer software features, such as the ability to combine full-scan acquisition for library searching with single ion monitoring (SIM) for superior detection limits, further promote productive data generation over a wide concentration range. Used effectively on modern instrumentation, this method produces comprehensive information on complex samples.
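The sketch below illustrates how such a combined full-scan/SIM acquisition might be time-programmed, with narrow SIM windows overlaid on a continuous full scan. The compounds, retention-time windows, monitored ions and dwell times are hypothetical examples, not settings from any particular instrument or method.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SimWindow:
    compound: str
    start_min: float        # open the SIM window shortly before the expected peak
    end_min: float          # close it shortly after
    ions: List[float]       # characteristic m/z values monitored for this target
    dwell_ms: int = 50      # time spent on each ion per cycle (assumed)

# Full scan runs for the whole chromatogram; SIM windows overlay target regions.
sim_program = [
    SimWindow("trichloromethane", 3.8, 4.4, ions=[83, 85, 47]),
    SimWindow("benzene",          4.3, 4.9, ions=[78, 77, 51]),
]

def active_sim_ions(time_min: float) -> List[float]:
    """Return the SIM ions to monitor at a given chromatographic time."""
    return [mz for w in sim_program
            if w.start_min <= time_min <= w.end_min
            for mz in w.ions]

print(active_sim_ions(4.35))   # overlap region -> ions from both windows
```

The full-scan data supports library searching for identification, while the SIM windows concentrate measurement time on a few ions to push detection limits lower for the listed targets.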
Conclusion
The ability of GC/MS to serve the environmental industry with higher-quality and more comprehensive analyses is a direct result of increasing computer capability. As the industry's need for productivity has grown and the instrumentation has become more affordable, the technique has grown in popularity and become firmly established as an indispensable laboratory tool. Combined with automatic identification of compounds through automated spectral search and match, GC/MS has become the industry standard for many environmental analyses. In addition, ease of use, which allows less-skilled operators to run the technique, and automation have both lowered the cost of ownership, further enhancing its value.
References
1. Heller, S.R. Today's Chemist at Work, 1999, 8(2), 45.
2. Environmental Testing and Analytical Services: Overview of Markets and Competition. Environmental Business International, San Diego, California, 2004.
3. Environmental Laboratory Business Planning Study — 1998. Miller and McConnaghy, LLC, Cary, NC, 1997.
Zoe Grosser is the segment marketing manager for Analytical Sciences in the PerkinElmer Life and Analytical Sciences Division. She may be contacted at [email protected]