Environmental Perspectives

I am an environmental engineer and was a consultant on that topic for 45 years. I am sharing some of what I've learned through 3 lectures that I'll give to whoever wants to listen and wants to pay for my travel. This lecture is the most technical of the three and is about topics I practiced every day. I will assume you're here because you want this level of detail, as opposed to my two other more “populist” lectures, so I'll get right into it.

The topics I will cover here are:

  1. Interesting facts
  2. How the environment is tested
  3. What is done with data
  4. Environmental regulation
  5. Limitations
  6. NIMBY
  7. Product Safety

The last two are simply bees in my bonnet for which I have proposed solutions. Given our time limits, I assume you want to go to sleep tonight, and not by my doing, so I will only give you a taste of each topic. My objective is to make you more informed and more skeptical about what deserves skepticism. Some of you may know more about some of these topics than I do. If so, please go easy on me; I'm getting old.

1 - Interesting Facts

Facts about the environment differ depending on where you get your information, but they are fascinating, and hopefully I got these right (US):

  • Water use: 350 Billion gallons daily, 7% of all precipitation.
    • 23% from groundwater
    • 7% to homes of 258 Million people through 160,000 public water supplies,
    • 40% to agriculture
    • 140 Billion gallons per day to power plants (plus 60 Billion saline)
    • 32 Billion gallons per day treated at 16,000 sewage treatment plants
  • 20% of sewage is treated “onsite” (septic systems)
  • Million tons/yr of waste: 7,600 industrial, 250 municipal, 25 hazardous
    • For scale: 1 Million tons is about the largest ship ever made; the Great Pyramid of Giza is 6 Million tons
    • 1,900 landfills, 21 for hazardous waste
  • Food supply requires 1.2 acres per person
  • 100 nuclear reactors; 2,000 tons/yr nuclear waste
  • 97 Quads (quadrillion BTU) of energy consumed per year
    • 36% petroleum
    • 26% gas
    • 20% coal
    • 9% renewables
    • 8% nuclear

2 - How the Environment is Measured

In order to manage the environment, we have to measure it. And we have to understand what the measurements mean, but that is a different topic. Environmental measurement is a 2-part problem: sampling and analysis.


The objective of sampling is to understand large areas with as few samples as possible. We call this “representativeness.” Sometimes a sampling event can be designed to be appropriately representative the first time around; sometimes a first round yields poor statistics that motivate a more representative second round. Either way, sampling and analysis (discussed next) can be expensive. I worked on river sampling projects that cost over $100 Million, each.

Environmental sampling is a 3-dimensional problem, which makes representativeness even more complicated. For example, soil sampling must consider depth - shallow soils because people are exposed to them, at the water table because it's the first encounter with that medium, or in multiple aquifers because there are various routes of contaminant transport. Another example is air sampling - ground-level measurements affect people, higher elevations affect pollution transport. This brings us to the most important element of environmental sampling - a clear statement of objectives. Exactly what is the sampling aimed at? In my experience, every data location, every sampling frequency, and every type of measurement must have an exact rationale. Otherwise we are left with boxes of data that no one knows what to do with. I've seen this dozens of times.

There are many types of sampling objectives. A few examples are:

  • Clean or dirty (or how dirty).
  • Offsite vs onsite.
  • Pollution release or transport rates or distance.
  • Exposure (humans or animals).
  • Time trends (up or down).
  • Time variability (cycles?).
  • Background.

The last example, background, is probably the second most important consideration for sampling - what is “normal.” I would say only half of the sampling programs I've seen, and I've seen hundreds, properly considered this important factor. But what is background? We consider background as either “natural,” meaning pristine, or “anthropogenic.” Urban soils have generally been affected by man's activities, and have an anthropogenic background.

Once we understand what we are looking for - we have clear objectives and know what's normal - we can hopefully design a representative sampling program. We must specify the number and location of each sample, with each location having 3 dimensions. We must specify frequency - this becomes more obvious with good objectives. We must specify what will be measured in the sample, which will dictate how big a sample we take and how we take it. For a truly representative sample, the sampling method must not alter the sample. EPA and other agencies have written hundreds of guidance documents about sampling methods, and I still found circumstances where I had to figure out the best approach myself. From all these considerations, it is hopefully obvious that the best environmental sampling programs are designed by people who know what they are doing - people who can set clear objectives, understand what to expect, know the methods, and can ad-lib when needed.

Some of the more common general sampling strategies are called “grid,” “random,” and “transect” sampling. There are whole textbooks written about them. One of the most common sampling mistakes I've seen is that the obvious is either oversampled or undersampled and then that bias is overlooked when considering the results. I was what might be called a data interpreter, and I would pay close attention to the sampling design and, to some extent, the sampling methods, when considering what the data meant.
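For those who like to see things concretely, here is a minimal Python sketch of how those three strategies lay out sample locations. The site dimensions, spacing, and sample counts are all made up for illustration:

```python
import random

def grid_samples(width_m, height_m, spacing_m):
    """Grid sampling: one location at each node of a square grid."""
    return [(x, y)
            for x in range(0, width_m + 1, spacing_m)
            for y in range(0, height_m + 1, spacing_m)]

def random_samples(width_m, height_m, n, seed=1):
    """Random sampling: n locations drawn uniformly over the site."""
    rng = random.Random(seed)
    return [(rng.uniform(0, width_m), rng.uniform(0, height_m))
            for _ in range(n)]

def transect_samples(start, end, n):
    """Transect sampling: n evenly spaced locations along a line."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * i / (n - 1), y0 + (y1 - y0) * i / (n - 1))
            for i in range(n)]

grid = grid_samples(100, 100, 25)               # a 5 x 5 grid, 25 locations
rand = random_samples(100, 100, 25)             # 25 random locations
line = transect_samples((0, 50), (100, 50), 5)  # 5 points along a line
```

Each strategy serves a different objective: grids cover an area evenly, random locations support unbiased statistics, and transects follow a suspected gradient - downstream of a discharge, for example.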

Some of the things I would think about when considering the representativeness of a data set were:

  • Data density - sometimes one sample in an acre is enough, while other times 20 samples in a backyard are not enough. It depends on what you expect for spatial variability and how the data will be used.
  • Timing and frequency - if a release is cyclical and the samples don't accommodate that, entirely wrong results may be measured. For example, if you're interested in how much light hits the earth and you sample at midnight, you won't know. Similarly, if the concern is daily and the samples were weekly, the results might be meaningless.
  • Parameters - “parameters” are the qualities being measured. If the concern is specific toxic chemicals, but the sampling was for, say iron, the sampling is not helpful.
  • Detection limits - if the concern starts at let's say a level of 1, but the analysis can only see down to a level of 10, the results can't possibly address the concern.

There are many more sampling issues, but I hope you get the gist.
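One practical wrinkle of detection limits deserves a concrete note: laboratories report results below the detection limit as “non-detects,” not as zeros, and the data interpreter must decide how to handle them. Here is a minimal Python sketch using one common (if crude) convention - substituting half the detection limit; the numbers are made up:

```python
def summarize_with_nondetects(results, detection_limit):
    """Summarize measurements where non-detects are reported as None.
    Substituting half the detection limit is a common convention,
    though better statistical methods exist for heavily censored data."""
    substituted = [r if r is not None else detection_limit / 2
                   for r in results]
    return {"mean": sum(substituted) / len(substituted),
            "detects": sum(1 for r in results if r is not None),
            "total": len(results)}

# Five samples; two were below a detection limit of 10.
summary = summarize_with_nondetects([25.0, None, 40.0, None, 15.0], 10.0)
```

How non-detects are handled can change the apparent average considerably, which is one more reason detection limits must suit the sampling objective.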

This last example shifts us to the second part of environmental measurement - analysis. Once we have a representative sample, what do we do with it? Environmental analysis has come a long way in the last 100 years - from what we call “wet chemistry” to electronic “instrument analysis.”

When I started my career in water pollution, wet chemistry prevailed and what was typically measured was pretty fundamental - acidity, turbidity, suspended solids, dissolved oxygen, a few trace metals like chromium, and a general parameter of interest for a number of reasons called “biochemical oxygen demand” (BOD). What we think about as “chemical contamination” today was typically measured with a wet chemistry test for “phenols” and for “oil and grease.” As you might guess, such measurements led to a fairly primitive definition of pollution.

Today we have instrument chemistry - gas chromatography, mass spectrometry, atomic absorption spectrophotometry, and inductively coupled argon plasma spectroscopy, to name a few. These instruments are very specific and very sensitive, so, for example, we can measure 1 part per quadrillion of 2,3,7,8-tetrachlorodibenzodioxin (TCDD) in water. One ppq is about the thickness of a hair compared to the distance to the moon. TCDD is a specific environmental contaminant that is particularly toxic. Now we can even measure groundwater elevations from satellites in space. Our ability to analyze environmental samples, and to deduce things about our environment from those measurements, is now absolutely astounding. But let me back up and explain how some of these analytical devices work.

Just as environmental sampling has 2 parts, sample analysis has 2 parts - sample preparation and measurement. (I try to simplify everything into 2 parts.) Sample preparation takes the sample from the field, such as dirt in a bottle, and carefully (meaning with no important alterations) makes it ready for the instrument to measure. Many instruments need inputs in liquid form; sample preparation typically accomplishes this. EPA and other agencies have dozens of guidance documents and regulations about analytical methods to be used for environmental samples. An example sample preparation would be extraction of the contaminants in a soil sample into a solvent that can be injected into the instrument. Sounds simple, but it's not. The instruments do 2 things - identify the compound of interest and measure its quantity, which then gets converted to a concentration in the sample by keeping track of how you prepared the sample. Some of the important instrument analyses are:

Gas Chromatography: This is the workhorse for organic (carbon-based) chemical analysis. It takes that injected sample preparation and spreads it out as it moves through a heated tube, so that different chemicals come out the other end at different times based on their specific chemical properties. That “spread” is called chromatography, derived originally from the spread different colors made across an absorbent “paper.” In a chromatogram, the instrument's output, a chemical like benzene will exit the instrument much sooner than a chemical like TCDD, so we know when to look for what when we do the next step of measuring quantity. (We know “when” because we ran known “standards” of the chemicals previously, and their “when” never changes.)

Mass Spectrometry: Mass spectrometers are put at the end of gas chromatographs to help with chemical identification. They do this by smashing the molecule, which breaks into predictable pieces that allow identification of the parent chemical.

Detectors: Having identified the compound, we must next measure its quantity. This is done by putting detectors at the end of the chromatograph. Detectors are instruments that respond to a chemical's property in proportion to its quantity. For example a flame ionization detector measures ions formed when the gaseous chemical is burned. Electron capture detectors measure the electrons released from a chemical after it is bombarded with radioactive particles.

The output of this analysis is called a chromatogram, which is a graph of peaks where the distance of a peak from the origin (the time since injection, called the “retention time”) identifies the chemical, and the area under the peak allows calculation of the chemical's quantity (by comparison to that of analytical standards injected previously). The majority of organic chemicals in the environment are measured with gas chromatography.
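The identify-then-quantify logic can be sketched in a few lines of Python. The retention times, tolerance window, and response factors below are invented for illustration, not taken from any real method:

```python
def identify(retention_time, standards, tolerance=0.05):
    """Match a peak's retention time against previously run standards;
    the 5% tolerance window is illustrative."""
    for name, rt in standards.items():
        if abs(retention_time - rt) <= tolerance * rt:
            return name
    return "unknown"

def response_factor(standard_areas, standard_concs):
    """Calibration: average peak area per unit concentration."""
    factors = [a / c for a, c in zip(standard_areas, standard_concs)]
    return sum(factors) / len(factors)

def quantify(peak_area, rf, prep_factor=1.0):
    """Peak area -> concentration, corrected for sample preparation
    (dilution or concentration during extraction)."""
    return peak_area / rf * prep_factor

name = identify(2.15, {"benzene": 2.1, "TCDD": 38.0})
rf = response_factor([1000.0, 2000.0, 4000.0], [10.0, 20.0, 40.0])
conc = quantify(1500.0, rf)
```

Real methods do this with multi-point calibration curves and strict quality control, but the arithmetic is essentially this.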

Atomic Absorption Spectroscopy: Spectroscopy is the analysis of spectral colors created when chemicals are behaving in certain ways - by burning, for example. Colors that get created can be measured, called emission spectroscopy, as can colors that disappear, called absorption spectroscopy. Trace metals, when heated, make certain reproducible wavelengths of light disappear from the spectrum, and this disappearance is used to identify the exact trace metal and its quantity. That is how elements such as chromium, cadmium, and lead are measured. Inductively coupled argon plasma spectroscopy is based on the same general principles, but does the job in a different, often more efficient, way. By the way, spectroscopy based on similar principles is how astronomers determine the composition of stars.

The instruments are powerful and sensitive. Based on regulations and scientific knowledge, commercial laboratories have an established suite of chemicals they typically look for in environmental samples. For example, under the Clean Water Act, EPA established 129 “Priority Pollutants” in 1976, and that list is a typical analytical suite you can order as a package. Often, labs are asked to go beyond these standard lists and report the next 10 or 20 most prevalent compounds, called “Tentatively Identified Compounds.” They do this by comparing their mass spec data to computer “libraries” of the mass spectra of thousands of chemicals, aided by a computer of course. I wrote a paper in 1988 demonstrating that the computer matches were often wrong, so a live chemist had better check. I hope they do.

Once the samples are taken and analyzed, data interpreters consider several qualities about the data, including the detection limit, precision, and accuracy, in addition to the overall sampling design and rationale. A detection limit is the lowest level that can be measured. There are various types of detection limits, analogous to hearing sound, recognizing it as speech, and understanding its content. Depending on the lab and/or the chemical, detection limits can vary by factors of a thousand. Reported concentrations close to detection limits are probably not accurate; those 10 times or more over the detection limit are more accurate.

Accuracy is the ability to be right. An accurate measurement of a true value of 10 reports something very close to 10.

Precision is the ability to reproduce a result. If you had, say, 3 measurements, all 3 would be quite close to each other if the measurements were precise. Laboratories have routines for demonstrating their detection limits, accuracy, and precision and a thorough interpretation of environmental data includes review of these “quality control” data.
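Here is a minimal Python sketch of how those two qualities are typically quantified from quality control samples - percent recovery of a known standard for accuracy, and relative standard deviation of replicates for precision (the numbers are made up):

```python
import statistics

def percent_recovery(measured, true_value):
    """Accuracy: a measurement of a known standard, as % of truth."""
    return 100.0 * measured / true_value

def relative_std_dev(replicates):
    """Precision: spread of repeated measurements, as % of their mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# A lab measures a 10.0 mg/L check standard three times.
reps = [9.8, 10.1, 10.0]
recovery = percent_recovery(statistics.mean(reps), 10.0)
rsd = relative_std_dev(reps)
```

A lab reporting, say, 95-105% recovery and a few percent relative standard deviation is doing well; a thorough data review checks these numbers against the lab's stated control limits.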

One final facet of environmental analysis I will share is for environmental parameters that are actually related groups of chemicals. It is very difficult to measure these groups accurately, but some of them are very important environmental contaminants. One such group is called PCBs, which stands for polychlorinated biphenyls. PCBs are a group of 209 chemically similar compounds, but the measurement of all 209 individually is often not practical, and sometimes is not meaningful. I spent years on PCBs and won't go into much more detail, but suffice it to say, their measurement is tricky and always should be reviewed by someone who understands both their nature and their measurement.

3 - Data Analysis

The 2 parts of data analysis are: 1) organizing it; and 2) understanding it. Organizing environmental data properly helps one to understand it. Organizing it consists of storing it properly and displaying it effectively. Understanding it involves comparing it to regulatory standards and concentration thresholds believed to create impacts such as health effects, fish kills, ecosystem damage, or aesthetic effects. Further understanding is gained by applying environmental knowledge about how the chemical should behave based on its solubility and things like that. This is where the rubber meets the road, where we decide what it means.

Organizing environmental data starts with understanding what it is. Since most of you are not practitioners, I won't give you the details for making a database, but rather I will tell you the general structure of environmental data, which will help you think about it. Environmental data come in - how many, you guessed it - 2 parts: data and metadata. What you might typically call the “data” are the quality measures - the concentration, the temperature, the pressure, the hydraulic characteristics, such as conductivity, and so on. The metadata is the information needed to put the data in context, such as depth, date, sample type, and so on. The units of measurement are also critical elements of the data and metadata - such as mg/L for concentration or ft bgs for depth. Without knowing the units, the data are useless. I once reviewed a report that had no units, so I called the author, and he asked, “what's that?” Not everyone knows what they're doing.

Sometimes the units tell you something about the data. For example, concentration units of mg/L tell me it's a water sample, compared to mg/kg, which is usually a soil concentration. Non-scientists and imprecise scientists use the layman's concentration unit of “parts per million,” which tells you nothing about the sample type and can be interpreted incorrectly for air samples. For concentration, use the metric system of mass per unit volume or mass per unit mass if you want a job from me in environmental science.
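The air-sample pitfall is worth a concrete example. For gases, “ppm” is a volume ratio, so converting it to a mass concentration requires the chemical's molecular weight. A minimal sketch (the 24.45 L/mol molar volume assumes an ideal gas at 25 °C and 1 atm):

```python
def ppm_to_mg_per_m3(ppm, molecular_weight, molar_volume_l=24.45):
    """Convert a gas concentration from ppm (volume ratio) to mg/m3.
    24.45 L/mol is the molar volume of an ideal gas at 25 C, 1 atm."""
    return ppm * molecular_weight / molar_volume_l

# 1 ppm of carbon monoxide (molecular weight 28.01 g/mol)
co = ppm_to_mg_per_m3(1.0, 28.01)  # about 1.15 mg/m3
```

The same 1 ppm corresponds to a very different mass concentration for a heavier molecule, which is exactly why the mass-per-volume units are safer.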

Typically an environmental data “point” (datum) may have a half dozen or dozen critical pieces of information attached to it, metadata, such as sample type, date, units, corresponding temperature, and so on.
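In code, such a datum-plus-metadata record might look like the following Python sketch. The field names are illustrative, not any agency's schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Measurement:
    """One environmental datum plus the metadata needed to use it."""
    value: float
    parameter: str                        # what was measured, e.g., "benzene"
    units: str                            # "mg/L" (water), "mg/kg" (soil), ...
    medium: str                           # "groundwater", "soil", "air", ...
    sample_date: date
    location_id: str
    depth_ft_bgs: Optional[float] = None  # depth below ground surface

m = Measurement(value=0.005, parameter="benzene", units="mg/L",
                medium="groundwater", sample_date=date(2014, 6, 1),
                location_id="MW-3", depth_ft_bgs=25.0)
```

Strip away any one of those metadata fields and the value 0.005 quickly becomes uninterpretable - which is the point about units made above.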

In today's regulatory world, we tend to think of contaminants in groups defined by the regulations - volatile organic chemicals (VOC), semivolatile organic chemicals (SVOC), trace metals, PCBs, pesticides, and chlorinated organics, to name a few. I named the Priority Pollutants a moment ago - 129 chemicals of interest under the Clean Water Act. RCRA, one of two federal laws governing hazardous waste, has several hundred “Appendix IX” groundwater monitoring compounds. Sometimes these groups and subgroups are defined by the way they are analyzed, sometimes they have somewhat similar properties, and sometimes they have similar environmental characteristics. Sometimes it's a mixture. For example, VOCs tend to be more soluble in water, and one can order a “VOC lab analysis.” Sometimes these subgroups are further differentiated by their analytical method or governing regulation. For example, EPA Method 8021 is for VOCs under RCRA, one list, but EPA Method 524 is for “purgeable” organic compounds in drinking water - same concept but different lists.

In general, environmental data are organized according to media, such as air or groundwater, analytical fraction, such as VOCs, and sampling round, such as 2014 data. Soil data are probably least sensitive to time. For example, 2010 soil data are likely to still represent conditions whereas 2010 groundwater data may not.

After the data are organized, we need to see what we have to start interpreting what it means. I'm a firm believer in the graphical presentation of data. One powerful graphical tool is the contour plot (fig), where a quality like concentration is given different colors for different ranges - or, even better (fig), different colors and heights for those ranges. Another powerful graphical tool is posted data (fig), where all kinds of information can be shown visually connected to its location. This kind of plot lets one see both the data and where the data are located. The latter is usually critical in understanding environmental conditions.
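At its core, a contour plot just bins each value into a concentration range and assigns that range a color. A minimal Python sketch, with made-up breakpoints and well names:

```python
def concentration_bin(value, breaks=(1.0, 10.0, 100.0)):
    """Assign a concentration to a contour-style range, the way a
    contour plot assigns colors (the breakpoints are illustrative)."""
    labels = ["<1", "1-10", "10-100", ">100"]
    for i, b in enumerate(breaks):
        if value < b:
            return labels[i]
    return labels[-1]

# "Posted data": each location shown with its binned value.
posted = {loc: concentration_bin(c)
          for loc, c in {"MW-1": 0.4, "MW-2": 55.0, "MW-3": 7.2}.items()}
```

Mapping these binned labels at their sample locations is what turns a table of numbers into a picture of where the contamination is.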

Patterns and trends are what we look for - a pattern of sources, transport, and extent of contamination, for example, or a trend to determine whether things are getting worse or better. As you can see, knowing what data analyses will be needed bears on the sampling event design. I always said to my clients that data interpreters are the best sampling program designers because they know what they will need.

Once the data are organized and displayed so we know where things are, we need to consider what they mean. There are two considerations, sort of related:

  • How do the data compare to regulatory standards or triggers?
  • Is it safe?

I have another lecture called “Is It Safe,” which deals with this second consideration in detail, but I will give you the CliffsNotes version here.

More on environmental regulation later, but here for data interpretation, the important point I want to make is how we compare concentrations to pertinent regulatory standards or criteria. For example, we compare:

  • Drinking water measurements to Federal drinking water standards, called MCLs (“Maximum Contaminant Levels”)
  • Air measurements to Federal NAAQS (“National Ambient Air Quality Standards”)
  • Surface water measurements to classification standards (more about that next)

In some cases, there are no standards. For example, there are no standards for groundwater quality per se, but it might be of interest to consider groundwater quality in terms of drinking water standards. I said “per se” because groundwater quality management is context-specific. For example, most states now have optional “lookup tables” of action-trigger concentrations for groundwater and soils at hazardous waste sites. In many cases, this option has greatly facilitated hazardous waste cleanups.

In all cases, environmental standards have been developed by federal and state agencies to be protective of health and the environment. Sometimes, however, often actually, we also use the measurements in a risk assessment - human health and/or ecological - as an added assessment or to override the lookup tables because site specific conditions aren't properly covered by those tables. That's the “is it safe” part.

For years, I used to say risk assessment had 2 parts (are you surprised?) - prediction of how chemicals get to the body and what happens inside the body. But it's really 3 parts - 1) to the body; 2) into the body; and 3) inside the body. EPA and many states have guidance documents regimenting how risk assessment gets performed, and they all boil down to the following:

  • To the body - we predict this using a science we call “fate and transport” analysis. This kind of analysis uses various well-defined approaches to estimating contaminant migration to predict chemical concentrations or mass loadings at the body's intake points (e.g., in air inhaled). There's not a single reference book about this; rather, it is the culmination of what many environmental scientists study and know.
  • Into the body - this is a science we call “exposure analysis.” It considers how various chemicals enter the body, through skin absorption, for example, as well as basic body functions like how much air a person breathes. Reference books for these factors exist, but more important is to pose appropriate “exposure scenarios.” An exposure scenario is the activity resulting in the exposure to a chemical - a person swimming in a lake once a week for 1 hour each time, for example. Agencies can sometimes go off the edge by insisting on unrealistic exposure scenarios. For example, I once had a case where the agency made me consider the risk of PCB exposures to a 1-year-old infant 4 ft into stream sediments. I resisted, telling them that PCBs would not be that child's health threat.
  • Inside the body - this is the science we call toxicology. In most cases for human health risk assessment, the toxicity of a chemical is defined by its toxicity value in an EPA database called IRIS (Integrated Risk Information System) and EPA makes you use IRIS. IRIS was compiled and is constantly updated by EPA toxicologists.

Putting all this together, we estimate the potential risks, under the exposure scenarios examined, at the contaminant concentrations predicted. Risk is expressed differently for carcinogens and noncarcinogens. EPA says if the potential cancer risk is greater than 1 in 10,000, or the noncancer dose is higher than its recognized “safe dose” (in IRIS), action must be taken. Two important considerations if you ever need to think about risk assessment results:

  • Risk assessments usually overestimate risks, for a number of reasons; and
  • Consider if the exposure scenarios assumed in the risk assessment apply to you.

One other thing commonly done on the “is it safe” front is to compare measurements to background. In some cases, the measurements might be considered unsafe but they are comparable to background, so there's not much that can be done about it. Arsenic in groundwater in Bangladesh is one example.

So to summarize:

  • We measure the environment with representative sampling to address a specific objective.
  • We organize the data and portray it in ways that are convenient and give insight.
  • We consider the data in terms of spatial distributions and time trends.
  • We compare the measurements to environmental standards and background.
  • We sometimes estimate the potential risks of exposures to the contaminants.
  • We take action if any of this calls for it.

In my general opinion, we know how to measure environmental conditions and how to think about the results to take action if necessary. Whether we actually measure what needs it and take the right action when it's needed is another story, however.

4 - Environmental Regulation

In the US, almost every aspect of the environment is regulated - drinking water, surface water, air, hazardous waste, new construction, nuclear power plants, factories, and much more. So what does it mean to be “regulated”? Generally it means that the government sets quality and sometimes operational standards, requires you to prove you meet them, and can take enforcement action if you don't. Keeping these regulatory handles in mind - standards, monitoring, enforcement - I'll summarize for you some of the key environmental regulations. One more piece of background - Congress passes laws, and then agencies implement those laws by enacting regulations. I will organize my discussion by key laws. The 4 big ones are the 1972 Clean Water Act, the 1970 Clean Air Act, the 1974 Safe Drinking Water Act, and 2 hazardous waste laws having the acronyms CERCLA and RCRA, which I will explain later. Each law has thousands of regulations.

Clean Water Act

Congress had toyed with water pollution control since 1948, but 2 years after Earth Day it passed the first comprehensive federal law, ever, to control water pollution - the 1972 Clean Water Act. Through this law, water pollution is managed by controlling wastewater discharges. Every pipe in the US must have a permit to discharge under Clean Water Act regulations. Each permit is reviewed by the agency in terms of what the receiving water can handle and still meet its designated water quality standards. Every surface water in the US has been “classified” - given water quality standards established based on what's feasible and what is desirable - from drinkable to fishable to waste conveyance, although there aren't many of the latter. The permit system is called by the acronym “NPDES” (National Pollutant Discharge Elimination System). The process for determining permit limits is called “waste load allocation.” It considers all the discharges and natural purification processes to arrive at allowable limits for each single discharge. The final part is called “stream classification” (although it also applies to lakes and saltwater). Once a surface water is classified, that classification carries with it predetermined water quality concentrations that scientists have determined meet the needs of the use.
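At its simplest, a waste load allocation is a mass balance at the outfall. A Python sketch with invented flows and concentrations; real allocations also account for decay, multiple dischargers, and low-flow conditions:

```python
def downstream_concentration(q_stream, c_stream, q_discharge, c_discharge):
    """Complete-mix mass balance at an outfall:
    C = (Qs*Cs + Qd*Cd) / (Qs + Qd). Flow and concentration units
    must be consistent."""
    return ((q_stream * c_stream + q_discharge * c_discharge)
            / (q_stream + q_discharge))

def allowable_discharge_conc(q_stream, c_stream, q_discharge, c_standard):
    """Back-calculate the permit limit that just meets the stream standard."""
    return ((c_standard * (q_stream + q_discharge) - q_stream * c_stream)
            / q_discharge)

# Illustrative: stream at 100 cfs and 1 mg/L upstream; discharge at
# 10 cfs; stream standard of 2 mg/L.
limit = allowable_discharge_conc(100.0, 1.0, 10.0, 2.0)
```

Running the mass balance forward with that limit confirms the mixed stream just meets its 2 mg/L standard, which is the essence of how a permit limit gets set.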

At the beginning of the Clean Water Act, in the 1970s, there was a huge nationwide effort to build and upgrade wastewater treatment plants. Many wastewaters were still discharged totally untreated at the time. Congress gave an 8-year schedule for this upgrade and made money available to municipalities through what was called the “construction grants program.” Industry was on its own, but it still had the deadline. The wastewater treatment problem turned out to be very complicated for industry because the nature of industrial wastewaters was so variable. By contrast, municipal sewage is fairly predictable; its biggest problem was leaky sewers. All sewers leak. By the 1990s all this had pretty much been worked out, water quality saw some dramatic improvements, and EPA turned to the last big problem for water quality management - stormwater control. Stormwater control is a very difficult problem because stormwater is dirtier than you think and flows are highly variable. The little ponds you see on highway median strips, and the hay bales around construction sites, are parts of the ongoing, evolving solution.

The Clean Water Act has many other elements - for example, wetlands protection and phosphate controls to save us from lake eutrophication. In my opinion, the Clean Water Act has generally been a success. Everyone treats their wastewater, and surface water quality has improved dramatically almost everywhere in the country. There is still room for improvement, however, and environmental science is still working on it.

The stated goal of the Clean Water Act is “zero discharge.” We are not there and probably never will be. It's not necessary.

The Clean Air Act

The Clean Air Act was the first modern comprehensive environmental law. It was passed in 1970. The Clean Air Act established nationwide standards for ambient air quality, called NAAQS (National Ambient Air Quality Standards). It also spurred hundreds of regulations for all emissions, old and new. Air quality management has some unique elements compared to water quality management. For example, air quality is affected by mobile sources, such as cars, and by stationary sources, such as smokestacks. Both are regulated by the Clean Air Act. Air quality is affected by point sources, such as smokestacks, and by what are called fugitive emissions, such as windblown materials off a factory yard or dirty air vented out of factory windows. Also, measuring air quality is more difficult than measuring water quality. On the positive side, to some degree, is the fact that our atmosphere is huge, which gives us a lot of dilution. Regardless, places with high densities of emissions, such as industrialized cities, still have air pollution problems, and these are called “non-attainment areas” because they fail to attain one or more NAAQS. EPA pays close attention to non-attainment areas. There are only 6 NAAQS - ozone, carbon monoxide, particulates, lead, sulfur dioxide, and nitrogen dioxide - but they dominate most air quality concerns, ranging from smog to health effects, such as from particulates and lead.

In addition to NAAQS, Congress passed an amendment to make EPA deal with HAPs (Hazardous Air Pollutants), which it does through emission controls rather than by ambient standards.

Implementation of the Clean Air Act is a unique (from a regulatory perspective) partnership between EPA and each state. States develop a statewide plan, called a SIP (State Implementation Plan), after reviewing and limiting emissions from everything in the state, and then EPA approves it or demands more stringency. Emission limits are based on predicted ambient air quality from estimated emission rates, with downwind impacts estimated by air quality modeling. As you might guess, this prediction is difficult because the wind doesn't always blow the same way, among a hundred other things. EPA has a standardized air model for emitters to use in making their predictions. Model predictions may be good to a factor of about 2.
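The core of most regulatory air models is a Gaussian plume. Here is the ground-level, plume-centerline form as a Python sketch; in practice the dispersion coefficients sigma_y and sigma_z come from stability-class tables, and all the numbers below are invented:

```python
import math

def plume_centerline_conc(emission_g_s, wind_m_s, sigma_y_m, sigma_z_m,
                          stack_height_m):
    """Ground-level, plume-centerline concentration (g/m3) from a
    continuous point source:
    C = Q / (pi * u * sy * sz) * exp(-H^2 / (2 * sz^2))."""
    return (emission_g_s / (math.pi * wind_m_s * sigma_y_m * sigma_z_m)
            * math.exp(-stack_height_m**2 / (2.0 * sigma_z_m**2)))

# 100 g/s from a 50 m stack in a 5 m/s wind, with dispersion
# coefficients for some downwind distance.
c = plume_centerline_conc(100.0, 5.0, 200.0, 100.0, 50.0)
```

The exponential term is why tall stacks reduce ground-level impacts, and the dependence on wind speed and dispersion coefficients is why the predictions are only good to roughly a factor of 2.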

There are many other elements to the Clean Air Act, such as the leaded gasoline phaseout, and in general air quality in the US is pretty good. There remain some problem areas; for example, Boston tends to be a non-attainment area for ozone in the summer. And air control can be expensive; for example, I worked on a lead smelter that spent $80 Million before it still had to close due to non-attainment. Throughout the world, millions of people die from air pollution each year, but far fewer in the US, thanks in large part to the Clean Air Act.

Safe Drinking Water Act

It is important to understand that if you drink water from your own well or from bottled water, this law does not apply to you. That is about 20% of you.

The Safe Drinking Water Act regulates the 160,000 public water supplies in the US. 84% of those public utilities supply water to fewer than 3,000 people. Many of those utilities use surface water, but 32% rely on groundwater. The Safe Drinking Water Act regulates the output quality of those utilities and requires that they keep a certain level of chlorine in their distribution pipes to prevent recontamination en route to your house. Drinking water chlorination has effectively eliminated waterborne pathogenic disease, such as typhoid, in the US, but it comes with a price: chlorination can create trihalomethanes, which are toxic, so EPA makes water supplies keep an eye on that.

The most important element of the Safe Drinking Water Act is the establishment of federal drinking water standards. This and some of the other important elements are:

  • MCLs (Maximum Contaminant Levels) - MCLs are federal drinking water standards that apply to all public water supplies. There are 90 MCLs, covering specific chemicals, microbes, and properties such as radioactivity. Primary MCLs are based on direct health effects and are enforceable; Secondary MCLs are based more on aesthetics and are not. EPA develops and updates MCLs by considering both the science and the economic impact of the potential regulatory threshold. A good example of the economic consideration is the paired concept of the MCL and the MCLG, or “Maximum Contaminant Level Goal.” For carcinogens, EPA sets the MCLG at zero, because zero consumption of a carcinogen is preferable. However, carcinogens also have MCLs that are higher than zero, because zero is not economically practical; the higher value aims at minimizing risk at a reasonable economic cost. The cancer risks at these nonzero values are probably minimal and safe, especially compared to other risks we face.
  • Drinking water treatment plants - EPA subsidizes funding for these systems and requires plant operators to be trained and certified. Most but not all public water supplies are treated. A typical treatment train includes removal of suspended solids and disinfection, with some systems also needing chemical removal using processes like activated carbon adsorption - think “Brita filters.” I remind you that a public system is also a network of pipes in addition to the treatment plant.
  • Monitoring and reporting - Public water supplies must test for MCLs at various points in the system, at frequencies that depend on the system. Testing follows EPA methods in EPA-certified labs. Public supplies report to states, and states report violations to EPA. MCL violations are enforced (fixed) by the states and EPA. Every community public water supply must prepare a “Consumer Confidence Report” each July containing information about monitoring results, MCL violations, and health effects. In addition, states prepare a report on all systems each July, which is often available on the internet. Finally, EPA maintains a database on all public water supplies in the nation, called “SDWIS - Safe Drinking Water Information System,” which can be accessed on EPA's website.
  • Source protection - the law requires states to conduct a “SWAP - Source Water Assessment Program” for every public water supply source. SWAPs map protection areas, identify possible contaminant sources, and report to the public. Any actual plans must come from local communities, however, and might require multijurisdictional cooperation because, as you know, rivers don't know city limits.
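To make the MCL-versus-MCLG economics concrete, the standard lifetime cancer-risk arithmetic used to evaluate a nonzero MCL for a carcinogen can be sketched as follows. The slope factor and concentration here are hypothetical; the 2 L/day and 70 kg defaults are the conventional adult risk-assessment assumptions:

```python
def lifetime_cancer_risk(conc_mg_l, slope_factor, intake_l_day=2.0, body_wt_kg=70.0):
    """Incremental lifetime cancer risk from drinking water.

    Standard risk-assessment form: chronic daily intake (mg per kg body
    weight per day) times an oral slope factor (per mg/kg-day). Lifetime
    exposure is assumed for simplicity.
    """
    cdi_mg_kg_day = conc_mg_l * intake_l_day / body_wt_kg
    return cdi_mg_kg_day * slope_factor

# Hypothetical carcinogen with an invented slope factor of 0.05 per mg/kg-day,
# at a hypothetical MCL of 0.005 mg/L (5 parts per billion):
risk = lifetime_cancer_risk(0.005, 0.05)
print(f"incremental lifetime risk ~ {risk:.1e}")
```

Risks in the 1-in-a-million to 1-in-10,000 range are generally what regulators treat as acceptable, which is why a nonzero MCL for a carcinogen can still be considered safe.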

One of the most interesting developments in water supply today is called “water reuse,” and it is getting big attention in dry areas like California. Water reuse involves recycling wastewater for beneficial use - for example, watering your lawn with your washing machine water. The water reuse business considers two types of wastewater - greywater and blackwater. Greywater is everything except your toilet water. Current greywater reuse work focuses on understanding its quality, the levels of treatment required for target uses, and the plumbing (i.e., collection and delivery) changes needed. Nice idea, but there are some subtle but significant issues, such as trace metals in water reused for irrigation. We are also trying to recycle wastewater sludges as fertilizer, but there are similar issues. Recycling is a great idea, but we must remember that they call it “waste” for a reason.

In my opinion, drinking water in the US is quite safe. Perhaps sometimes safer than bottled water. EPA studied bottled water years ago and found that 1/3 did not meet federal drinking water standards. If you use publicly-supplied water, look for your July reports and judge for yourselves. Unless, of course, you use a private well, in which case you'll be interested in what I have to say next.

Hazardous Waste Laws

In 1979 we got caught with our pants down at Love Canal, and my field of environmental engineering seemed to transform overnight into an obsession with hazardous waste. Some argued that the Clean Water Act caused our hazardous waste problems by forcing waste burial in order to protect surface water. But the problem was bigger than that. Industrialized America in the 20th Century made a lot of hazardous waste - 66 Million tons/yr of industrial process waste in 1967 - with no regulation and with a recommended practice of burying it in the ground as the safest thing to do. It came back to bite us in the 1980s, and Congress responded with 2 laws to manage it: RCRA (the Resource Conservation and Recovery Act) and CERCLA (the Comprehensive Environmental Response, Compensation, and Liability Act), better known as Superfund.

From your perspective as a neighbor to a hazardous waste site, these 2 laws do the same thing - they are aimed at protecting you. In fact, many of the regulatory requirements are similar, such as how cleanups are studied and considered. But RCRA has an important additional element: it regulates newly produced hazardous waste, in terms of how landfills are designed and operated and how factories dispose of their hazardous waste. Superfund just cleans up old waste.

There are some interesting stories about Superfund. For example, when it was passed in 1980, the first thing it did was to name sites across the country that qualified to be Superfund sites, which was a matter of being placed on what is called the NPL (National Priorities List). It did this with a model called the HRS (Hazard Ranking System). Any site scoring above 28.5 on the HRS became a Superfund site; the 28.5 trigger was reportedly set so that every Senator could get a Superfund site in his or her state. Currently, there are about 1,800 federal Superfund sites and about 200,000 state Superfund sites. Superfund sites used to cost about $20 Million to clean up and took 20 years. Today, Superfund is turning to cleaning up urban river sediments, and these projects cost several Billion dollars. Who knows how long they will take?

The “L” in CERCLA is for liability and Congress gave EPA a large hammer to make potentially responsible parties (PRPs) pay for these cleanups, but when there is no one to nail, EPA pays. Originally, EPA paid out of a “Superfund” loaded with money from a tax on oil and chemicals, but today the fund is broke and Congress won't act to fix it. What else is new?

There is a federal regulation called the NCP (National Contingency Plan), originally established to address oil spills, but adapted to hazardous waste cleanups in 1990. The NCP is the roadmap for CERCLA and RCRA studies and cleanups. The 2 laws and their regulations sometimes use different names for the same thing, but EPA's stated philosophy of the NCP is to ensure “CERCLA quality,” consistent cleanups nationwide through the following basic steps:

  • Remedial Investigation (RI) - Called the RFI (RCRA Facility Investigation) in RCRA, this is the site study. It may be a string of studies, because looking for a needle in a haystack is often iterative. Its purpose is to define the nature and extent of contamination. By the way, the term “nature and extent of contamination” was developed near the onset of our Love Canal and associated landfill negotiations in the 1980s.
  • Removal Action - sometimes also called an “Interim Action,” this is the set of rules that allows early cleanup of some things that might be especially dangerous prior to making final decisions about the whole site. Cleanup of leaking barrels stacked in an abandoned warehouse is one example.
  • Feasibility Study (FS) - this is the study that examines what to do. Usually it examines a half dozen combinations of remedy elements including “no action.” EPA published an extensive guidance manual for performing what is called the “RI/FS.”
  • Record of Decision (ROD) - responsible parties often perform the RI/FS, but EPA reviews it and makes the final decision which it publishes in the ROD.
  • Remedial Action (RA) - once the decision is made, it is carried out through what is called the RA. Sometimes more specific field studies, called “Remedy Design Investigations” (RDIs), are required for this. An example of an RDI might be sampling to more precisely define the excavation “cut lines” called for more generally in the ROD.
  • No Further Action (NFA) - this is the holy grail. When or if EPA issues a No Further Action letter, you're done and presumably the site is clean. Often, however, there are operation and maintenance requirements that might go on for decades - for example, a groundwater extraction and treatment program that will take decades to fix groundwater contamination.

Let me move now to the types of remedies that are generally viable for hazardous waste sites. Conceptually, there are 2 types: those that manage the contamination so things get no worse and risks are minimized and those that actually clean up hazardous wastes. Often site remedies are a combination of the 2 because it often is just not possible to clean up everything.

Some of the more common approaches to hazardous site remediation are:

  • Excavation: contaminated soils and chemicals are dug up, thus removing the source of contamination and sometimes the dirtiest impact areas. But what do you do with it after it's dug up? Incineration, thermal desorption, chemical stabilization, landfarming (biodegradation), and offsite landfilling are some options. There are strict RCRA rules for offsite landfilling these days, but if they eventually are not good enough will we see a second cycle of all this? Will we have a Superduperfund?
  • Insitu stabilization: soils and chemical areas can be injected with cement to form monolithic blocks that no longer leach chemicals. This is becoming a popular technology and I hope it works over the long term.
  • Pump & Treat: this is groundwater extraction with the pumped water sent to a treatment plant and then discharged to a stream or reinjected into the ground. Originally, about 90% of groundwater problems were addressed this way, but it is currently used only 30% of the time.
  • Natural attenuation: for many organic chemicals, we have found that the bugs in the ground can eat them. Sometimes we must enhance the ground with nutrients, and sometimes we need do nothing other than prove it's happening with the right kind of monitoring, in which case it's called MNA (Monitored Natural Attenuation). MNA is used at about 30% of sites now, replacing Pump & Treat - perhaps not so much because it works so well as in recognition that Pump & Treat can be both technically and economically ineffective.
  • Barriers: this is a management technique rather than treatment. Hazardous wastes or conditions that cannot be treated can still be contained using caps, vertical walls (e.g., sheet piling or trenches filled with cement, called “slurry walls”) and even hydraulic barriers caused by pumping groundwater in certain ways.
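The monitoring behind MNA often boils down to fitting a first-order decay rate to groundwater data and projecting when a standard will be met. A minimal sketch with hypothetical monitoring data (the 5 ug/L endpoint is benzene's MCL):

```python
import math

def decay_rate(c0, c1, years_between):
    """First-order rate constant (1/yr) estimated from two monitoring rounds."""
    return math.log(c0 / c1) / years_between

def time_to_standard(c_now, standard, k_per_yr):
    """Years until concentration decays to the standard, assuming
    first-order attenuation: C(t) = C0 * exp(-k * t)."""
    return math.log(c_now / standard) / k_per_yr

# Hypothetical benzene plume: 1,200 ug/L five years ago, 800 ug/L today.
k = decay_rate(1200.0, 800.0, 5.0)
years = time_to_standard(800.0, 5.0, k)  # 5 ug/L is the benzene MCL
print(f"k = {k:.3f}/yr, about {years:.0f} years to reach 5 ug/L")
```

Even this optimistic two-point fit projects decades of monitoring, which is why MNA decisions carry long operation and maintenance tails.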

MNA and barrier management in a way exemplify the futility we've learned after 35 years of chasing hazardous wastes. For a number of technical reasons that we mostly understand, cleaning up hazardous waste sites is usually not completely successful. We can make hazardous sites safe, however: we can model the risks posed by the final plan and ensure they are minimal, and when all else fails, we restrict the deed on the land so riskier uses are prevented.

I worked on Love Canal at the beginning of Superfund. No one in government or science knew what they were doing about this problem at that time. Much money and time was wasted. But today, we have learned what we are doing and most hazardous waste sites are dealt with effectively, although not necessarily cost effectively. You should generally rest assured from the first part, but be concerned about the second part - wasting money. That is because there is only so much money in the world that will be spent on the environment, so we should spend it wisely. Also, wasted money one way or another comes out of your pocket.

Other Environmental Regulations

Many other environmental and health issues are regulated in the US. By one estimate, there are hundreds of thousands of environmental regulations in the US - too many to count. This poses a real problem for compliance.

Let me finish this part by telling you what is not regulated that might actually pose a health risk to you. Two important ones are radon and chemical product safety. Radon causes lung cancer. EPA advises (as opposed to enforces) that 4 picocuries per liter (pCi/L) of indoor air is the safe threshold. Here is a map of radon in the ground across the US (fig), and you can see that over half the country exceeds this level in subsurface soil gas. That soil gas infiltrates your home through cracks in your basement slab and often raises indoor radon above this “safe” advisory level. The rub is whether the advisory level is actually safe. The health risk at that level is likely greater than a 1-in-10,000 chance of getting cancer, which is the trigger for cleaning up hazardous waste sites. Said another way, EPA advises that you are safe at a risk level it is ready to spend a Billion dollars cleaning up if it's in a river. Have your indoor air tested, and your water too, because radon vapor released during showering poses risks. If your air has radon above 4 pCi/L, get mitigation: a pipe through your basement floor with a fan that sucks the gas out and vents it above your roof. It usually costs less than $5,000 and will get your radon down to about 1 pCi/L, which is about the best you'll get. Still not entirely safe, but better.

The second unregulated issue in this country is chemical exposures from everyday products. In my opinion, this is the largest risk we run from environmental exposures today. I will finish this talk with a discussion on this topic, saving the best for last.

5 - Limitations

Although I believe we are doing the best we can today, it is important to understand the limitations to what we are doing with the environment. The biggest limitation by far is climate change, but that's beyond the scope of this talk. Hopefully we will continue to advance our knowledge and find a way to pay for improvements on the rest while we wait for the planet-wide collapse. The important limitations within the scope of what I've discussed today are, in my opinion:

  • Clean water: Stormwater and, in a few cases, toxic chemicals. In many US cities, stormwater runs through what are called “combined sewers,” a network of both sanitary and storm sewers designed with overflows to surface waters when the network gets overwhelmed with flow surges during storms. Those overflows contain all the nasty stuff on our streets as well as sanitary sewage and sewer sediments. We are working on the problem by separating sewers - storm from sanitary - and by trying to treat the stormwater, but we are not there yet. That's the main reason our rivers and lakes are not really clean yet.

Toxic chemicals are the other water quality limitation in some cases. Bioaccumulation is the storage of toxic chemicals in animal tissue, sometimes in ratios of a million to one. This means that a very small concentration of a chemical in water can bioaccumulate to a quite high concentration in fish tissue and if we eat that fish, it could be harmful. Biomagnification is increased tissue concentration moving up the food chain - big fish eats small fish and its tissue concentration ends up being higher than small fish. More risk if we eat the big fish. This is not a universal issue, but it can be important in some waterways. It is what I call a third order water quality issue. First order - make it so the river doesn't stink. Second order - avoid fish kills and let us swim. Third order - make it safe from toxic chemicals.
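The arithmetic behind bioaccumulation and biomagnification is simple multiplication, which is exactly why tiny water concentrations can matter. A sketch with a hypothetical chemical and invented factors:

```python
def fish_tissue_conc(water_conc_ng_l, bcf_l_per_kg):
    """Fish tissue concentration (ng/kg) from a water concentration and
    a bioconcentration factor (BCF)."""
    return water_conc_ng_l * bcf_l_per_kg

# Hypothetical PCB-like chemical at 1 ng/L (about a part per trillion) in
# water, with an invented BCF of 100,000 L/kg.
small_fish = fish_tissue_conc(1.0, 1.0e5)   # ng/kg
# Biomagnification: assume an invented trophic magnification factor of 5
# for each step up the food chain (big fish eats small fish).
big_fish = small_fish * 5
print(f"small fish: {small_fish / 1e6:.2f} mg/kg, big fish: {big_fish / 1e6:.2f} mg/kg")
```

A part-per-trillion water concentration thus becomes tenths of a milligram per kilogram in fish tissue, which is the scale at which fish-consumption advisories start to matter.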

  • Safe Drinking Water: Although public water must meet MCLs and MCLs are believed to be safe concentrations, we learn more every day about trace contaminant health effects and must keep ahead of the issue of toxic chemicals. A new issue of concern today is parasites such as cryptosporidium, which sometimes might not be killed by chlorination. EPA and many others are working diligently on these issues, but that doesn't eliminate them now. Not a big deal, but worth keeping track of.
  • Hazardous Waste: It is important to understand that it is often not possible to completely clean up a hazardous waste site. Nor is it always necessary. The first question is what level of residual is safe, and risk assessment does a fairly good job of answering that. EPA's regulatory preference is to remove and destroy hazardous waste whenever possible, but it's just not always possible. One of the reasons is that most hazardous waste sites have NAPL (non-aqueous phase liquid). NAPL might be petroleum waste, still-bottom tars, spent solvents, or anything else that does not mix well with water. A bottle of salad dressing has two liquid phases - oil and water - and so do most hazardous waste sites. The problem is that it is almost impossible to remove all the NAPL, and NAPL serves as a highly concentrated, long-lived source of chemical contamination to groundwater and soil.

EPA has rules for this hopeless situation, called “Technical Impracticability” (TI). The problem is that I don't think they implement the rule properly. Their basic approach is to demand that you try first and then apply for a “TI waiver” when you can demonstrate futility. A TI waiver must demonstrate: 1) that it's hopeless to go on; and 2) that management approaches will be used to contain what's left and thus guarantee safety. But 35 years of Superfund has shown that NAPL leaves at least a hopeless residual in the ground, so insisting on trying first is a waste of money. Much money and time could be saved by going straight to better management approaches instead of removal attempts - in other words, get the TI waiver up front.
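A back-of-the-envelope mass balance shows why insisting on removal first is often futile: even if groundwater left the source zone saturated with dissolved chemical, depleting a NAPL residual would take generations. The site numbers below are hypothetical:

```python
def source_lifetime_years(napl_mass_kg, eff_solubility_mg_l, gw_flux_m3_day):
    """Years to dissolve away a NAPL source, assuming groundwater leaves
    the source zone saturated at the effective solubility -- an optimistic
    upper bound on the removal rate a pump & treat system could achieve."""
    removal_g_day = eff_solubility_mg_l * gw_flux_m3_day  # mg/L * m3/day = g/day
    return napl_mass_kg * 1000.0 / removal_g_day / 365.0

# Hypothetical solvent source: 2,000 kg of residual NAPL, an effective
# solubility of 10 mg/L, and 5 m3/day of groundwater through the source zone.
years = source_lifetime_years(2000.0, 10.0, 5.0)
print(f"roughly {years:.0f} years even at best-case dissolution")
```

A century at the best-case dissolution rate is the kind of result that argues for containment and a TI waiver up front rather than decades of pumping.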

Common sense and thinking outside the box must be applied to environmental management every day.


6 - NIMBY

“NIMBY” stands for “not in my backyard,” and it is killing progress. That said, who in their right mind wants a landfill next door? That's the conundrum. It takes over 20 years to permit a nuclear power plant, and today people have essentially given up trying. It even took more than 10 years to get windmills approved near Martha's Vineyard. By my estimate, we have about 60 years of carbon fuel left, and renewables might supply 20 or 30% of our future energy needs, so how are we going to get nuclear power plants into our backyards? My solution is bribery.

Not really bribery, but think of it as appropriate, offsetting compensation. There are legitimate reasons why someone might not want an environmentally dubious project in their backyard, such as the prospect of health effects and property value diminution. Projects must make it worth the risk, locally. If you want to put a landfill in a town, give it a new park. Or something like that. The park should be an intrinsic part of the landfill project. The key is to figure out what offsetting compensation is appropriate - what will truly offset, for example, the property diminution caused by the landfill. You must prove to me that the hit to my home value will be neutralized by the attractiveness of the offsetting benefits your project offers.

The health risk potential must also be addressed in ways improved over current approaches. There are 2 parts (as always) - making the risk assessment understandable and offering an appropriate risk-benefit analysis. When the promoters of the Yucca Mountain nuclear waste repository tried to justify it to Nevada, they argued that the nation needed it. What do Nevadans care about that? The project needed to be justified in terms of benefits to Nevadans, and offsetting compensation meaningful to Nevadans needed to be offered as part of the project. Yucca Mountain died because there was nothing in it for Nevada. By contrast, GE is sending the PCB-laden sediments it is dredging from the Hudson River to a landfill in Texas, and the local mayor is thrilled because the landfill became a needed revenue source and an employment vehicle.

The risks of a proposed project must be made understandable to the neighbors. My company once worked on siting a waste incinerator in California. Numerous public meetings were held, filled with risk assessment nerds talking about Gaussian plume models. Over the audience's heads - DOA. Not because those people were stupid, but because the science is filled with jargon, deep scientific background, and scientists who never learned to explain things properly. Having gone through it myself, I can tell you that scientific education in the US carries a nuance that if you can't be understood, you must be smart.

The bottom line - prove and explain the risks are acceptable and offer direct, local benefits that offset the risks.


7 - Product Safety

I will finish this talk with my pet interest - the safety of chemicals in products. Currently there is no effective regulation of chemicals in products and very little information about them. For example, go to the cleaning fluid aisle of your supermarket and look at the labels. You will typically see “99% inert ingredients.” That 99% is inert with respect to the claimed performance of the product, but not necessarily inert to your health. One product I dug deeper into was 50% carcinogenic solvent. How many cleaning and personal care products do you have in your home? They are vaporizing under your sink right now, and you probably rubbed some of them on your skin this morning. Are they safe? You don't know.

I ran a risk assessment consulting firm for 25 years, and my opinion is that the potential risks from chemicals in the products we use are far greater than those from most landfills, factories, or dirty rivers. EPA doesn't want to regulate the inside of your home (hence its radon approach), and you don't want it to, so how should we manage this issue of chemicals from products in our homes?

The first no-brainer is to label products properly. Second is to measure indoor air and exposures, such as in your shower, and assess the risks rigorously enough to capture the many variabilities. Third is to rate products according to their chemical safety. No one does this; even the Consumer Product Safety Commission does not. If such a rating existed, you could make buying decisions on both a product's efficacy and its safety. If you had 2 products to clean counters and they both did it well, which would you choose - the safer one or the less safe one? The safer one, if you knew one from the other, of course. But you don't know.

The key to a Product Safety Rating system is that it would need to be simple. Just as the Energy Star system lets you pick a 4-star refrigerator over a 3-star if they both have the shelves you need, we need the same level of simplicity to make product safety decisions. That is no easy challenge because the risk assessment issues are complex. But so are the energy efficiency issues and do you even know what's behind that Energy Star system? Do you even know who performs the rating? Somehow we trust it to be true.

One of the huge complexities of evaluating product safety is posing appropriate exposure scenarios. I discussed exposure scenarios earlier under hazardous waste site risk assessment. Same approach here. The secret lies in your smartphone.

Here's what I envision:

  • You zero in to the product you want and see how many stars it has.
  • You forgot what the stars mean, but you assume 4 is better than 3.
  • Alternatively, you want to remind yourself what the stars mean so you scan the barcode with your phone and it gives you an explanation.
  • Next you want to understand if the rating is based on an exposure scenario that applies to you. For example, maybe you only shampoo your hair once a week but the exposure scenario used assumed once a day. So you might ignore the 3 stars because your exposure is less, and assume your use of the product is at the 4 star level. The challenge will be to make the exposure scenario easy to understand for supermarket aisle decision-making.
  • Finally, you balance the product's qualities of performance and safety to make your purchase decision. All this must take no longer than 5 seconds, or I wouldn't use it.
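The barcode-scan step above could be sketched as a simple rescaling: the label's rating assumes a default use frequency, and the app adjusts the risk to your actual use before mapping it back to stars. The star thresholds and product numbers below are invented purely for illustration:

```python
def stars_from_risk(lifetime_risk):
    """Map a lifetime risk estimate to a 1-5 star safety rating.
    The risk cutoffs here are invented purely for illustration."""
    for stars, cutoff in ((5, 1e-6), (4, 1e-5), (3, 1e-4), (2, 1e-3)):
        if lifetime_risk <= cutoff:
            return stars
    return 1

def personalized_stars(label_risk, assumed_uses_per_week, my_uses_per_week):
    """Rescale the label's risk linearly to the shopper's actual use
    frequency, then re-rate. Crude, but supermarket-aisle simple."""
    my_risk = label_risk * my_uses_per_week / assumed_uses_per_week
    return stars_from_risk(my_risk)

# Hypothetical shampoo: rated assuming daily use (7x/week) at a risk of 5e-5.
label_stars = stars_from_risk(5e-5)              # the rating on the shelf
weekly_stars = personalized_stars(5e-5, 7, 1)    # you only shampoo once a week
print(f"label: {label_stars} stars, for you: {weekly_stars} stars")
```

This mirrors the shampoo example: a once-a-week user could reasonably treat a 3-star daily-use rating as 4-star for their own exposure.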

Assuming someone can develop a simple, effective rating system, the most obvious benefit is that consumers will be armed with information to make purchase decisions based on safety. But there are other benefits. I believe the existence of such a system would make all products safer, just as the Energy Star system has served to make all appliances more energy efficient. In addition, manufacturers may use it as a marketing tool, reinventing and promoting their safer products. Everyone wins. I hope someone takes the baton to make this happen some day.

I could go on forever about this stuff, but it's time to end. Environmental science and regulation have come a long way, but there are improvements still needed and new frontiers still to open. I hope I've reassured you that in general you are safe with our environment, and I hope I've helped you understand how we think about it. The better our understanding, the better our decisions.

Other Lectures by Dr. Neil S. Shifrin
