The Tricky Business of Regulating Biology

Policies regulating biotechnology are slow to adapt to the fast-evolving innovation of genetic engineering.

Originally published June 30, 2016, by Brooke Borel

In 2015, a plant pathologist at Pennsylvania State University successfully used new gene editing technology to delete a relatively small bit of DNA from the genome of a white button mushroom. The result: a mushroom that resists turning brown. The advantages to farmers and distributors seemed obvious, but surely such a biological innovation — one that precisely tweaked the stuff of life itself — would run into a maze of regulatory oversight and examination.

Well, not so much. In April of this year, the USDA said the mushroom would not be regulated because it falls outside the agency’s regulatory role — it only has oversight over engineered plants that have genetic material introduced into their DNA with a plant pathogen, an older technique used to make genetically modified organisms, or GMOs. Direct gene editing doesn’t require a plant pathogen to introduce or change DNA, which means the USDA has no authority over such products.

Similarly, oversight by the Environmental Protection Agency is only triggered by agricultural products engineered to make their own pesticides, which the mushroom does not. And as of this week, the Food and Drug Administration had not published a safety review of the mushroom — a voluntary procedure, anyway.

Biotechnology experts don’t seem particularly worried about the safety of the edited mushroom, and it doesn’t appear to pose major health or environmental concerns. But it’s worth noting that the white button didn’t so much pass a regulatory test as fail to find an agency with the proper authority to administer one — and as such, it raises a simple question: What other biological innovations might slip through these regulatory cracks?

After all, by now, humans have managed to tweak the underlying biology of a growing list of plants and animals, producing everything from high-tech crops to microbes engineered for industrial use. But regulating these entities has turned out to be a complicated affair — particularly because the framework of policies and procedures that would govern advanced biotech was developed long before scientists even conceived of the kind of biological tinkering that produced the white button mushroom.

What’s in place is a framework dating back to 1986, when the federal government decided to regulate new genetically modified agricultural products under an existing set of policies — many of which were conceived to assess chemical risks from air pollutants, say, or pesticides. Risks are risks, the thinking seemed to be, and the Coordinated Framework for Regulation of Biotechnology was born, with oversight responsibilities delegated to the FDA, the EPA, and the USDA.

Since then, the framework has been updated just once: in 1992.

“The issue has always been the flexibility of the agencies to be able to adapt to emerging technology,” says Todd Kuiken, a senior program associate at the Science and Technology Innovation Program at the Woodrow Wilson Center. “And I won’t even say at the same pace as the technologies are developing, but at least 100 yards behind them.”

Last year, the White House Office of Science and Technology Policy announced a review of the Coordinated Framework, asking these three agencies “to develop a long-term strategy to ensure that the system is prepared for the future products of biotechnology.”

But how do you govern living things, which, unlike chemical compounds, are apt to be so unpredictable and so different from one case to the next? How to capture all the potential risks when new biological entities are introduced to the wider environmental milieu, or when they pass down variations of their genes from one generation to the next? And can regulation itself — notoriously immovable — ever keep up with biology or the blur of human innovation?

According to the OSTP, updates to the Coordinated Framework will be available for public comment this summer. But unless the new guidance grapples with how best to regulate living things and fast-moving innovation, it may well miss the mark.

“If the OSTP review is going to be truly worthwhile and have really a transformative beneficial impact, it should be asking to reexamine whether the agencies even have the right legal authority,” says Anne Kapuscinski, an ecological risk assessment expert at Dartmouth College, referring mainly to the FDA’s authority over genetically modified animals.

“And secondly,” she adds, “they should be reviewing the scientific scope and quality of the risk assessments they are asking for.”

Innovation can certainly help society by growing food more efficiently, protecting us from getting sick, or providing conveniences that simply make life a little easier. But new technologies introduce new risks, which is why the government has regulatory bodies to begin with. The first seeds of the FDA sprouted in the mid-1800s to protect consumers from mishaps in agriculture and medicine; the EPA launched in 1970, partly in response to the infamous legacy of DDT; and the Animal and Plant Health Inspection Service, the part of the USDA that regulates biotechnology, was formalized in 1972 with the conflicting mandate to protect both agriculture and wildlife.

In order to protect the public and the environment from a potential hazard, the agencies have to understand that hazard in context. A key policy tool is the risk assessment, a process that proposes to use science to sort out what’s safe and what’s not, and one that is “a wholly imperfect exercise,” says Margaret Mellon, a molecular biologist and independent consultant. “You are never going to surface every possible harm. Nor are you going to be able to assess every possible harm perfectly.”

Still, scientists have worked on providing guidance. In 1983, the National Research Council published a report nicknamed the Red Book, which first codified risk assessment for federal agencies into four steps. First, identify the hazard. Then, determine the dose-response, or how much contact a person needs to have with the hazard in order to see a bad health effect. Then assess the exposure, or how likely a person is to come into contact with it. And finally, characterize the overall risk by combining the first three steps. Other reports followed, including the Blue Book in 1994, which focused on air pollution, and the Silver Book in 2009, which tried to address the holes in existing risk assessments that allowed chemicals to enter the marketplace with little to no oversight.
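To see how chemical-centric that template is, here is a minimal sketch, in Python, of the arithmetic a Red Book-style non-cancer assessment boils down to: estimate an exposure dose, then compare it with a dose-response benchmark. Every number and name below (the residue concentration, intake, body weight, and reference dose) is a hypothetical placeholder, not a value drawn from any actual review.

```python
# Illustrative only: a toy, non-cancer risk characterization in the spirit of
# the Red Book's four steps. Every number here is a hypothetical placeholder.

def average_daily_dose(concentration_mg_per_kg, intake_kg_per_day, body_weight_kg):
    """Step 3, exposure assessment: dose in mg per kg of body weight per day."""
    return concentration_mg_per_kg * intake_kg_per_day / body_weight_kg

def hazard_quotient(daily_dose, reference_dose):
    """Step 4, risk characterization: compare the exposure estimate with the
    dose-response benchmark from step 2. A quotient above 1 flags potential concern."""
    return daily_dose / reference_dose

# Step 1, hazard identification, happens before any math: deciding that the
# residue is worth assessing at all.
dose = average_daily_dose(
    concentration_mg_per_kg=0.05,  # residue level in food (hypothetical)
    intake_kg_per_day=0.3,         # amount of that food eaten per day (hypothetical)
    body_weight_kg=70.0,           # adult body weight (hypothetical)
)
hq = hazard_quotient(dose, reference_dose=0.001)  # reference dose in mg/kg/day (hypothetical)

print(f"daily dose ~ {dose:.5f} mg/kg/day, hazard quotient ~ {hq:.2f}")
```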

These reports were written, for the most part, with chemicals in mind, but they eventually served as a framework for assessing risks in living things, too, from GMOs to invasive species. “Risk assessment procedures have largely been derived from and focused on chemical entities,” says Daniel Simberloff, an ecologist at the University of Tennessee, Knoxville. “And living organisms have two features that chemicals don’t. One is they evolve in unpredictable ways, and second is they usually have autonomous means of dispersing.”

In other words, a chemical will behave in a fairly predictable way, breaking down in the environment. Organisms have the opportunity to do the opposite: they are able to reproduce.

Norman Ellstrand, an evolutionary geneticist at the University of California at Riverside, puts it like this: “Compare the 24-hour fate of a gram of plutonium to that of a gram of E. coli bacteria under optimal growth conditions. You will have slightly less plutonium and more than a kilogram of bacteria.”
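Ellstrand’s comparison is easy to check with back-of-the-envelope arithmetic. The sketch below assumes a textbook doubling time of roughly 20 minutes for E. coli under ideal conditions with unlimited nutrients, the same idealization his thought experiment leans on, and takes plutonium-239’s half-life of about 24,100 years as given; both figures are assumptions supplied here rather than details from the article.

```python
import math

# Back-of-the-envelope check of the plutonium-versus-E. coli comparison.
# Assumptions: ~20-minute doubling time and unlimited nutrients, which is
# exactly the idealization that breaks down in the real world.

doubling_time_min = 20.0
doublings_per_day = 24.0 * 60.0 / doubling_time_min    # 72 doublings in a day

idealized_mass_g = 1.0 * 2 ** doublings_per_day        # ~4.7e21 grams on paper
doublings_to_1kg = math.ceil(math.log2(1000.0))        # 10 doublings to pass 1 kg
hours_to_1kg = doublings_to_1kg * doubling_time_min / 60.0

# Plutonium-239 has a half-life of roughly 24,100 years, so a single day of
# radioactive decay barely registers.
half_life_days = 24_100 * 365.25
fraction_remaining = 0.5 ** (1.0 / half_life_days)

print(f"idealized E. coli mass after 24 h: {idealized_mass_g:.1e} g")
print(f"time for 1 g to exceed 1 kg: ~{hours_to_1kg:.1f} h ({doublings_to_1kg} doublings)")
print(f"fraction of Pu-239 left after one day: {fraction_remaining:.10f}")
```

Even granting that nutrients run out long before the idealized 10^21 grams, ten doublings, a bit over three hours, are enough to turn one gram into more than a kilogram, while for practical purposes all of the plutonium is still there.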

It’s also hard to know how genes might spread, Ellstrand adds. Crops may cross with wild relatives over great distances, while bacteria have been known to swap genes across species, contributing to the spread of antibiotic resistance.

Of course, just because genes can swap among species doesn’t necessarily make the resulting organism riskier than a product of sexual reproduction. In fact, conventionally grown crops and microbes can do the same thing, and they aren’t regulated at all. Trying to determine which products pose a potential risk if released into the environment, though, is no easy task, and it’ll get no easier as genetic engineering, gene editing, and even synthetic biology — which aims to build new organisms from the DNA up — get faster, cheaper, and easier.

The issue can cut both ways. “The problem is, every time we’ve had to regulate biology since the 1980s, we’ve always picked a chemical precedent,” says Joyce Tait, director of the Innogen Institute at Edinburgh University. According to Tait, this has caused delays in potentially beneficial innovation as regulators have tried to work out how to adapt a chemical-based system to biology. “It’s been true of GMOs, it’s been true of things called biological pesticides — which use microorganisms or complex biological molecules for pest control — that’s had the same problems.”

Rather than a clear set of rules, risk assessments for GMOs usually come from the companies seeking federal approval, and they vary depending on the product, company, and agency.

At the EPA, for example, there is no codified risk assessment, although the process is ongoing, says Chris Wozniak, a biotechnology special assistant with the Biopesticides and Pollution Prevention Division. “We have worked on some rulemaking to try to establish some data requirements,” he says, “and we will be presumably ongoing with that in the future.”

A codified biological risk assessment is certainly possible, but it’s the mishmash of current regulations and risk assessments that’s worrying.

“It’s not that we can’t do biological assessments — we’ve been doing them for a long time. It’s when we piece together such an ad hoc collection of assessments on microbes, plants, animals,” says Jaydee Hanson, a senior policy analyst at the advocacy group Center for Food Safety. “The Obama administration is on the right track, asking for a redo of the Coordinated Framework. The agencies in the main haven’t really wanted to step up to the task, and partly it’s because it’s not an easy task to take on all these technologies with a limited staff.”

Already, the mashing of new technologies into an old regulatory system has led to curious policies. Consider the AquAdvantage Salmon, the first GMO animal approved for food, which scientists genetically engineered to grow faster than its wild counterpart. Rather than the USDA, or even the Fish and Wildlife Service, the FDA regulates the fish. It’s classified as an animal drug.

“They aren’t so naive as to treat the salmon as if it is just a pharmaceutical,” says Peter Jenkins, president of the Center for Invasive Species Prevention. “But the problem is that…it’s being done by an outfit that, while they have some expertise, really doesn’t have the resources or knowledge on regulating a wildlife species like salmon. It’s not a good fit within institutional structure.”

There are other examples, too. Microorganisms engineered to make biofuels and other industrial chemicals may be regulated under the Toxic Substances Control Act, primarily intended to oversee chemicals. And then there are gene drives, a new genetic tool that helps to quickly push a specific trait through a population. For example, scientists are working to engineer pathogen-resistant mosquitoes that could breed with a wild population and eventually erase its ability to spread disease. Most regulations try to control the spread of introduced organisms or chemicals; the intention with a gene drive is quite the opposite.
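To see why a gene drive inverts that logic, consider a deliberately simplified toy model rather than a description of any real mosquito project: assume random mating, no fitness cost, and perfect homing, so every heterozygote passes on the drive allele. Under those assumptions the drive allele’s frequency jumps from q to 2q − q² each generation, while an ordinary neutral allele stays where it is.

```python
# Idealized, deterministic spread of a gene-drive allele versus an ordinary one.
# Simplifying assumptions: random mating, a very large population, no fitness
# cost, and 100 percent homing, so every heterozygote transmits the drive allele.

def next_frequency(q, drive=True):
    """One generation of allele-frequency change.

    With perfect homing, heterozygotes (frequency 2q(1-q)) pass on only the
    drive allele, so the next generation's frequency is q**2 + 2*q*(1-q) = 2q - q**2.
    Without a drive, and with no selection, the frequency simply stays at q.
    """
    return 2 * q - q * q if drive else q

q_drive = q_plain = 0.01  # both alleles start at 1 percent of the gene pool
for generation in range(1, 13):
    q_drive = next_frequency(q_drive, drive=True)
    q_plain = next_frequency(q_plain, drive=False)
    print(f"gen {generation:2d}: drive allele {q_drive:6.1%}, ordinary allele {q_plain:6.1%}")
```

In this idealized sketch the drive allele climbs from 1 percent of the gene pool to near fixation in roughly ten generations on its own, which is precisely the self-spreading behavior that rules built around containing a release were never designed for.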

“When you look at it from a legal perspective, you could argue they’re essentially regulating [GMOs] like a chemical,” says Kapuscinski. “But the ecologists and the biologists — and I think even a lot of the staff at the EPA — knew that you can’t really regulate it just like a chemical. And that’s what caused some of the paradoxes and mismatches between the policy and law and regulation, and what the real issues are.”

So what’s the best way to regulate biological innovations? And how should the framework change?

Tait suggests a system-based approach: “I think we need to think quite deeply about the biological entity as a biological system. And how you would regulate a system rather than a molecule.”

Others would like the U.S. to adhere to international food safety standards and risk assessments set by Codex Alimentarius, a collaboration between the World Health Organization and the Food and Agriculture Organization. “We think the U.S. should be leading the world, not at the end,” says Michael Hansen, senior staff scientist at Consumers Union. “And that they should recognize that [new genetic techniques] all do fall under the genetic engineering, and there should be required safety assessments consistent with Codex.”

And a new report from the NRC recommends a tiered U.S. regulatory system, which would focus on the characteristics of a crop regardless of how it was made. The report suggests that crops with novel features should be subject to safety testing, including a full scan of the plant’s genome, or the collection of all of its genetic information, to compare with existing crops or products already on the market.

These tests would presumably catch unexpected differences, including those that pose risks, although the scientists I’ve interviewed who work with gene editing in the lab say such full-genome comparisons are uncommon, mostly due to cost and time.

“Take your plant, look at it, and if there’s no difference, well then you’re in great shape,” says Fred Gould, co-director of the Genetic Engineering and Society Center at North Carolina State University and chair of the committee that produced the recent NRC report. “If you do see a difference, then we’re back to where we have been for years, needing animal testing.”

Of course, another, more drastic approach to addressing new biological products is to do away with all of the current regulations and start over. “What we would argue is that Congress needs to adopt more plenary, fulsome, appropriate new laws to regulate genetically engineered animals of all kinds,” says Jenkins, “instead of trying to cram them into existing policies and frameworks.”

But is that likely — can we build smart, new, fast-adapting regulations from scratch? When I’ve posed that question to experts, I’ve mostly been met with a laugh.

“No,” says Jenkins. “Not right now.”

Brooke Borel is a New York-based science journalist and a contributing editor at Popular Science magazine. Her work has also appeared in The Atlantic, The Guardian, BuzzFeed News, Slate, and PBS’s NOVA Next, among other publications. She is a 2016 Cissy Patterson Fellow at the Alicia Patterson Foundation.

This article was originally published on Undark. Read the original article.
