Members of the California State Assembly's no-amount-of-regulation-is-ever-enough Democratic caucus recently announced forthcoming legislation to address oil spills. The initiative was prompted by the November 2007 oil spill in San Francisco Bay caused by the container ship Cosco Busan.
What we really need is not just more, but more intelligent regulation. However, the federal government has made that difficult.
Accidents that cause oil spills are inevitable as long as human error, mechanical failure and the vagaries of weather exist. One factor that should be controllable is the government's creation of disincentives to developing gene-spliced micro-organisms that could degrade spilled oil. But Draconian federal regulations ensure that the techniques available for responding to these disasters remain low-tech and marginally effective: deploying booms to contain the oil, spraying chemicals to disperse it, and spreading absorbent mats.
At the time of the catastrophic 1989 Exxon Valdez spill in Alaska, there were great expectations for modern biotechnology applied to "bioremediation," the biological cleanup of toxic wastes, including oil. William Reilly, then head of the U.S. Environmental Protection Agency, later recalled, "When I saw the full scale of the disaster in Prince William Sound in Alaska ... my first thought was: Where are the exotic new technologies, the products of genetic engineering, that can help us clean this up?"
He should know: innovation had been stymied by his own agency's hostile policies toward the most sophisticated new genetic engineering techniques. In 1997, the EPA issued its regulation of gene-spliced micro-organisms in final form, ensuring that biotech researchers in several industrial sectors, including bio-cleanup, would continue to be intimidated and inhibited by regulatory barriers.
The EPA regulation focuses on any "new" organism (strangely and unscientifically defined as one that contains combinations of DNA from unrelated sources) that might, for example, literally eat up oil spills. For EPA, "newness" is synonymous with risk. And because gene-splicing techniques can easily be used to create new gene combinations with DNA from disparate sources, those techniques "have the greatest potential to pose risks to people or the environment," according to the agency press release that accompanied the rule. (That's like arguing that newer, more comfortable automobiles with additional safety appurtenances are actually more dangerous, because people are likely to drive them longer distances.)
But science says otherwise. The genetic technique employed is irrelevant to risk, as is the origin of a snippet of DNA that may be moved from one organism to another: what matters is its function.
Scientific principles and common sense dictate which questions are central to risk analysis for any new organism. How hazardous is the organism you started with? Is it a harmless, ubiquitous organism found in garden soil, or one that causes illness in humans or animals? Does the genetic change merely make the organism able to degrade oil more efficiently, or does it have other effects, such as making it more resistant to antibiotics and therefore difficult to control?
EPA's subjecting of new biotechnology to extraordinary regulatory requirements is incompatible with the longstanding, widely held scientific consensus that gene-splicing technology is essentially an extension, or refinement, of earlier, cruder techniques of genetic modification. We should regulate on the basis of organisms' traits, not on whether they contain DNA from different sources.
The evidence against EPA's reasoning and the agency's negative policies toward testing new biotech products is overwhelming. The U.S. National Academy of Sciences has said there is no evidence that novel hazards are produced by gene-splicing or the movement of genes between unrelated organisms. The U.S. National Research Council has observed that use of the newest biotechnology techniques actually lowers the already minimal risk associated with field testing. The reason is that the new technology makes it possible to introduce pieces of DNA that contain one or a few well-characterized genes, in contrast with older genetic techniques that transfer or modify a variable number of genes haphazardly. All this means that users of the new techniques can be more certain about the traits they introduce into the organisms.
EPA's regulation requires costly case-by-case government review of virtually all field trials of gene-spliced micro-organisms. "Naturally occurring" organisms are exempt from this process, however, even if they might foul waterways or pose other serious environmental or public health risks. Moreover, the EPA continues to exempt from review all small-scale field trials of chemicals, including those similar to pesticides and the poison gas sarin.
While EPA's exempt-everything-except-gene-splicing approach can hardly be said to be risk-based, it does manifest a certain logic based on scale: small-scale experiments seldom pose significant safety concerns. Such small-scale R&D has been performed safely for more than a century with thousands of strains of micro-organisms (many of them genetically engineered with older, less precise techniques) for purposes as varied as pest control, frost prevention, artificial snow-making, promoting the growth of plants, mining, oil recovery, cleanup of toxic wastes and sewage treatment.
The bottom line is that organisms crafted with the newest, most sophisticated and precise genetic techniques are subject to discriminatory, extraordinary regulation. Research proposals for field trials must be reviewed case by case, and companies face uncertainty about final commercial approvals of products down the road even if they should prove safe and effective.
Government policymakers seem oblivious to the power of regulatory roadblocks. The expense and uncertainty of performing R&D with gene-spliced organisms have virtually eliminated the new biotechnology from bioremediation. Companies know that experiments using the new biotechnology will meet a wall of red tape and politics, and require vast expense.
Unscientific and regressive regulatory policies have already left a legacy of environmental damage and reliance on inferior methods for the cleanup of wastes. These policies are yet another example of the contempt in which federal environmental regulators hold science, technology and the public interest.
Henry I. Miller is a physician and molecular biologist and a fellow at Stanford University's Hoover Institution. His most recent book is "The Frankenfood Myth."