
Can Scientists Convince the Public to Accept CRISPR and Gene Drives?

Scientists are trying new ways to win over a skeptical public

Credit: Jörn Kaspuhl

In 1999 Robert Shapiro, then head of Monsanto, gave a stunning mea culpa at a Greenpeace conference in London. Monsanto’s first genetically engineered (GE) crops had been on the market for only three years, but they were facing fierce public backlash. After a botched rollout marred by lack of transparency, the company, Shapiro said, had responded with debate instead of dialogue. “Our confidence in this technology … has widely been seen, and understandably so, as condescension or indeed arrogance,” he said. “Because we thought it was our job to persuade, too often we’ve forgotten to listen.”

The damage was already done. Fifteen years later only 37 percent of the U.S. population thought that GE foods were safe to eat, compared with 88 percent of scientists, according to the Pew Research Center. Regulatory bodies in the U.S. fought for years over whether and how to label GE foods. In 2015 more than half of the European Union’s member states banned cultivation of the crops entirely.

Science doesn’t happen in a vacuum. But historically, many researchers haven’t done a great job of confronting—or even acknowledging—the entangled relationship between their work and how it is perceived once it leaves the lab. “The dismal experience we had with genetically engineered foods was an object lesson in what happens when there’s a failure to engage the public with accurate information and give them an opportunity to think through trade-offs for themselves,” says R. Alta Charo, a bioethicist and professor of law at the University of Wisconsin–Madison. When communication breaks down between science and the society it serves, the resulting confusion and distrust muddy everything from research to industry investment to regulation.


In the emerging era of CRISPR and gene drives, scientists don’t want to repeat the same mistakes. These new tools give researchers an unprecedented ability to edit the DNA of any living thing—and, in the case of gene drives, to alter the DNA of wild populations. The breakthroughs could address big global problems, curtailing health menaces such as malaria and breeding crops that better withstand climate change. But even if the promise of CRISPR and gene drives comes to fruition—and the resulting products are safe for both people and the environment—what good is the most promising technology if the public rejects it?

“Without transparency, we might see a kind of hyperpolarization,” says Jason Delborne, a professor of science, policy and society at North Carolina State University. Concerned groups will feel marginalized, and advocates won’t receive critical feedback needed to improve design and safety. “This puts the technology at risk of a knee-jerk moratorium at the first sign of difficulty,” he notes.

Credit: Amanda Montañez; Source: “What Is the ‘Science of Science Communication’?” by Dan M. Kahan, in Journal of Science Communication, Vol. 14, No. 3; published online August 25, 2015

To avoid that outcome, some researchers are taking a new tack. Rather than dropping fully formed technology on the public, they are proactively seeking comments and reactions, sometimes before research even starts. That doesn’t mean political and social conflict will go away entirely, Delborne says, “but it does contribute to a context for more democratic innovation.” By opening an early dialogue with regulators, environmental groups and communities where the tools may be deployed, scientists are actually tweaking their research plans while wresting more control over the narrative of their work.

Take evolutionary geneticist Austin Burt. In 2003 he published the first theoretical paper on GE gene drives. Shortly after, with funding from the Bill & Melinda Gates Foundation, he and his colleagues launched a research project to see if gene drives could control Anopheles mosquitoes, which spread malaria. Back then, in the pre-CRISPR days, the technology was so speculative that doing outreach “didn’t seem worth taking up people’s time,” Burt says. Now that a working gene drive may be ready for regulatory assessment within five years, it’s essential to talk to communities where the technology may be deployed, he adds, “so we can make things that are going to be acceptable not just to regulators but to the public at large.”

Much of this push for reflection has come from those wielding the checkbooks. In 2016 the National Academies of Sciences, Engineering, and Medicine published Gene Drives on the Horizon: Advancing Science, Navigating Uncertainty, and Aligning Research with Public Values. The sponsors—various federal agencies, the Gates Foundation and the Foundation for the National Institutes of Health—specifically asked for comprehensive recommendations on ethics and public engagement, says Keegan Sawyer, project director for the report. Other National Academies reports have included these elements, but the combination that appeared in the gene drive report was “unusual,” Sawyer says.

The Defense Advanced Research Projects Agency (DARPA) is among those heeding the guidelines. Its Safe Genes initiative, which will fund seven research projects aimed at understanding how to deploy and control gene drives, requires all its projects to have thorough public engagement plans. One DARPA grant recipient is a team at N.C. State that includes Delborne. He is overseeing social engagement on a gene drive project that aims to remove invasive mice from remote islands to protect seabirds and other wildlife. Although the research is underway, Delborne says the partners “have been really clear since the very beginning that if people reject this technology for ethical reasons or because there are concerns about the risks—even if the scientists don’t see it that way—there is essentially a pathway to no.” Simply put, the scientists are willing to halt the project.

At the even more extreme end of this trend is Kevin Esvelt, an evolutionary engineer at the Massachusetts Institute of Technology. He is considering genetic technologies to alter wild mice so that they cannot carry and spread the pathogen that causes Lyme disease. In 2016, before starting any work in the lab, Esvelt visited Lyme-plagued Nantucket, Mass., to gauge whether residents would be interested in genetic approaches—including gene drives, although he advised against that option because he doesn’t think it is suitable in this case. Nantucket followed Esvelt’s lead on gene drives, although the community is exploring an alternative technology that would immunize mice against the pathogen.

Esvelt was addressing head-on a special ethical quandary of gene drives, which are designed to spread and persist in the shared environment: Who should get to decide whether and how to use such technology? “To me, it is mind-boggling that we got so much attention just for going to the communities before we did anything else,” Esvelt says. “I think that says something about how science is typically done.”

Whether the emergence of these efforts will reduce fear and skepticism “depends on how responsive the people listening to the engagement are to those concerns,” says Jennifer Kuzma, who is co-director of the Genetic Engineering and Society Center at N.C. State. In other words, researchers must be willing not only to hear the public’s confusion and pushback but also to adapt—even if that means shelving a technology they think could change the world.

Brooke Borel is articles editor at Undark magazine and author of Infested: How the Bed Bug Infiltrated Our Bedrooms and Took Over the World.

This article was originally published with the title “Message Control” in Scientific American Magazine Vol. 317 No. 4, p. 68