Frack the Precautionary Principle

By Mike Dorf


At the end of last week, the Cornell Environmental Law Society hosted an extremely well-organized and well-run conference (with over 500 people in the audience) focused on energy policy in general and, in particular, on questions arising out of proposals to permit "hydrofracking" of the Marcellus Shale in New York State (and elsewhere in the region).  Hydrofracking, or simply "fracking," is short for "hydraulic fracturing," a process by which natural gas is extracted from low-permeability rock by injecting large quantities of water, sand, and chemicals into it at high pressure to fracture the rock.  The process is controversial because of the risk that the chemicals will find their way into drinking water and cause other environmental damage.  Industry, meanwhile, argues that it is safe and that natural gas is, on the whole, less damaging to the environment than other means of obtaining and using fossil fuels, such as coal mining and drilling for oil.  Here is a useful website that highlights the risks and here is one that argues in favor of fracking.  The conference featured speakers representing a wide spectrum of opinion on the question.

I was the moderator for one of the sessions, called "Law & Science: A Convergence in Energy Law & Policy."  The aim of this panel was to step back a bit from the immediate question of fracking and explore what it means to say--as many of the participants in the conference said--that sound policy should be based on solid science.  There were three speakers on the panel.  Here I'll give a very brief summary of their presentations and then focus on a point I made during the Q&A.

Kevin Haroff, a leading environmental attorney in private practice in California (whose caseload does not include companies currently seeking to hydrofrack but who has, in his career, represented oil and gas companies), explained that sound science these days generally means "probabilistic risk assessment" or PRA: a regulator or regulated entity attempts to imagine everything that can go wrong, does its best to assign a probability and a severity to each risk, and then plans accordingly.  He described the ongoing disaster at the Fukushima nuclear power plant as partly due to a failure to account for the risk of a simultaneous very large earthquake and very large tsunami.  (This is odd, given that tsunamis are caused by earthquakes.)  Haroff concluded by urging that regulators incorporate the precautionary principle into their PRA.  The precautionary principle--which is widely used in Europe and to a lesser degree in American law--holds that where some untested technology or other course of action poses a potentially large but currently unknown risk, the burden of proof should be on the proponents of the technology or course of action to show that it is safe before proceeding, rather than on the regulator to show that it is unsafe.
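
To make the structure of PRA concrete, here is a minimal sketch of the kind of bookkeeping it involves: enumerate failure scenarios, attach an estimated probability and severity to each, and rank them by expected harm.  This is my own illustration rather than anything presented on the panel, and the scenario names and figures are entirely hypothetical.

```python
# A minimal, hypothetical sketch of probabilistic risk assessment (PRA):
# enumerate failure scenarios, estimate a probability and a severity for
# each, and rank them by expected harm.  All figures are made up.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    probability: float  # estimated chance of occurrence over the project's life
    severity: float     # estimated harm if it occurs (e.g., cleanup cost in $M)

    @property
    def expected_harm(self) -> float:
        return self.probability * self.severity

scenarios = [
    Scenario("casing failure contaminates aquifer", 0.002, 500.0),
    Scenario("surface spill of fracking fluid", 0.05, 20.0),
    Scenario("induced seismicity damages property", 0.001, 100.0),
]

for s in sorted(scenarios, key=lambda s: s.expected_harm, reverse=True):
    print(f"{s.name}: expected harm of about ${s.expected_harm:.2f}M")
```

On this way of putting it, the failure Haroff identified at Fukushima was simply that the joint scenario--a very large earthquake together with a very large tsunami--never made it onto the list.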


My colleague Ted Eisenberg warned about the dangers of industry-funded science.  He cited meta-studies showing that industry-funded studies are substantially more likely to conclude that the sponsor's technology is safe than are truly independent studies.  His chief example involved Lipitor, a cholesterol-lowering drug that has been shown to be effective at lowering the risk of heart attacks in men but has not been shown effective in women.  Eisenberg noted how Pfizer, the maker of Lipitor, conducted a study to test its effectiveness in women, but never released the results of that study.  He explained that this practice is common in medicine, and that it would be very surprising if it were not also common across the range of regulated industries.  Thus, industry-funded science should be taken with a large grain of salt.

My former colleague Brad Karkkainen sketched three main approaches to the uncertainty inherent in scientific evaluation of proposed new projects: 1) proceed unless risks are shown to be unacceptably high; 2) do not proceed unless risks are shown to be acceptably low (i.e., apply the precautionary principle); or 3) adopt what is sometimes called "adaptive management," which means doing your best to evaluate risks ex ante and, if they are acceptably low, provisionally proceeding, while remaining prepared to change course or to adopt mitigation measures as you learn more during the course of the project(s).  Brad urged adaptive management, at least where the risks at issue are not irreversible.  He noted, however, that adaptive management has often been abused: regulators subject to industry capture skimp on the front-end research and end up green-lighting projects that never should have been green-lighted, or promise to make later changes if evidence of trouble emerges but do not follow through on those promises.  Brad nonetheless argued that a sufficiently committed regulator could use adaptive management effectively.
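
Purely as an illustration (again my own, not Brad's), the third approach can be read as a monitoring loop: proceed provisionally only if the ex ante risk estimate is acceptably low, then keep re-checking as monitoring data come in, and pause or mitigate if the estimate drifts above the acceptable level.  The threshold and the numbers below are hypothetical.

```python
# A hypothetical sketch of adaptive management as a monitoring loop:
# proceed provisionally if the ex ante risk estimate is acceptably low,
# then update the estimate as monitoring data arrive and change course
# (pause, mitigate) if it rises above the acceptable level.

RISK_THRESHOLD = 0.01  # maximum acceptable probability of serious harm (made up)

def adaptive_manage(initial_risk, monitoring_updates):
    if initial_risk > RISK_THRESHOLD:
        return "do not proceed: ex ante risk is too high"
    for observed_risk in monitoring_updates:
        # Re-estimate the risk in light of what monitoring actually shows.
        if observed_risk > RISK_THRESHOLD:
            return "pause and mitigate: monitoring shows elevated risk"
    return "project completed under the provisional go-ahead"

# Example: a project that looks acceptable ex ante but trips the threshold later.
print(adaptive_manage(0.004, [0.005, 0.006, 0.02]))
```

On this reading, the abuse Brad described amounts to lowballing the ex ante estimate or simply ignoring the monitoring data once the project is underway.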

All in all, these were three excellent and informative presentations.  My main intervention was aimed at the precautionary principle, which I favor where possible.  Nonetheless, I asked whether it makes sense to speak about the precautionary principle when there is no possibility of doing nothing.  To my mind, the precautionary principle is a generalization of Hippocrates' view that the physician should "first do no harm."  That makes sense in many contexts: E.g., if you have a stomach ache, and someone offers you an untested drug for it, you should not take the drug if you're not confident it's safe because the drug could do more harm than good.  Likewise, if you're heating your cave home through geothermal energy and someone proposes hydrofracking so you can move into a less energy-efficient home heated by natural gas, you should be very cautious because the hydrofracking could poison your water.

But we currently find ourselves in a situation in which all large-scale energy sources have large unknown risk factors.  Recent disasters in coal mining, oil drilling, and nuclear power, and the evidence of global warming all show that each of our primary sources of energy could lead to catastrophic harm, and because the technology and sources of each are constantly changing, the risks of each are currently unknown.  True application of the precautionary principle might mean that we rely only on power sources that are manifestly inadequate to power an advanced civilization.

When I posed this worry to the panel, Kevin responded that one can use the precautionary principle to inform comparative risk assessments.  I agree and disagree.  Assuming that conservation and proven clean energy sources are insufficient to power our civilization, we should make our comparative judgment with the best information we can get about comparative risks.  That was what Neil proposed a few weeks ago in discussing the Fukushima disaster (although I find myself less able to draw conclusions than he is about what our least bad options are).  But I'm not sure that comparative probabilistic risk analysis--which I agree is desirable--is an application of the precautionary principle.  A true application of the precautionary principle would probably mean going back to pre-industrialization levels of energy production and consumption--which itself would violate the precautionary principle in other domains were it to lead, as it well might, to mass starvation.

In the end, then, I find the precautionary principle attractive where we have the luxury of following it.  I just don't know that energy policy is such a domain.