April 15, 2012 (Vol. 32, No. 8)
Critical Debate Should Not Be a Monologue; Laypeople Need to Weigh In with Concerns
Since last December, flu scientists have been locked in a battle over two papers reporting successful bioengineering of the H5N1 flu virus. H5N1 (usually known as bird flu) is incredibly deadly to humans, but almost completely unable to transmit from human to human—which many think is the only reason we’ve been spared a catastrophic H5N1 pandemic.
The two research teams tinkered genetically with H5N1 to produce two new strains that are transmissible in ferrets and thus potentially transmissible in humans.
The battle focuses on whether the two papers should be published with their methodologies intact, and on whether further research along the same lines should be permitted.
Scientists on one side are worried about research autonomy and censorship, and excited about the possibility that publication and continuing research could lead to breakthroughs that might help prevent or prepare for a pandemic. Scientists on the other side are worried about laboratory accidents and human malevolence, fearful that publication and continuing research could actually launch a pandemic even if nature itself does not.
One issue in this battle is what role the public should play in these decisions.
Reassuring the Public
In response to the furor, flu transmission scientists organized a moratorium on their own research, aimed at calming the waters and buying time to make the case for unfettered research and publication. Soon after, the World Health Organization convened a meeting, mostly of influenza researchers, which predictably concluded that research and publication should be unfettered. But the group acknowledged that a pause was needed to allow time to reassure the public.
The “public” that has followed these events is tiny. Most people aren’t worried about an H5N1 lab accident or terrorist attack. They’re even less worried about an H5N1 natural pandemic. Convincing people that the H5N1 natural pandemic risk is alarming is a tougher and more important task than convincing people that the H5N1 terrorism risk is less so.
But let’s take the WHO conferees at their word and assume the job is to reassure the public that it’s okay to publish the two papers and resume H5N1 bioengineering research. What are proponents doing wrong in their effort to reassure the public? I’ll focus on just two (of many) issues: education and contempt.
Education Won’t Do the Job
I worry that advocates of unfettered H5N1 research and publication want to “educate” the public out of its concerns. That almost never works. In the risk communication and planning literature, this strategy is called “decide–announce–defend”: Figure out what to do; then tell the world that’s what you’re going to do; then rebut any and all objections with a mix of technical data and dismissive rhetoric. This is a thoroughly discredited approach.
Decide–announce–defend is especially unlikely to work when serious risks are involved. “How safe is safe enough” is a values question for society, not a science question for experts who have a horse in the race.
The dangers of concocting a potentially deadly pandemic virus in the lab are obvious. The benefits of doing so are less obvious. (Phrases like “mad scientist” come easily to mind.) So the burden of proof is on those who wish to assert that this is a sensible thing to do. Before making their case, they must first “own” the burden of proof, listen respectfully to people’s concerns, and join in a collaborative search for a potential compromise. Arrogant and self-serving rants about censorship won’t help.
H5N1 bioengineering researchers are essentially supplicants, asking everyone else for permission to carry out work with huge (but unquantifiable) potential risks and huge (but unquantifiable) potential benefits. I doubt they will actually address public concerns as supplicants, but that’s how they should.
Some of my corporate clients use the term “social license to operate” to capture their hard-won realization that they can’t do what they want to do if the public doesn’t want them to (and that that’s how it should be). Science, too, needs a social license to operate. The first step in securing your social license is acknowledging that you need it: supplicant, not educator.
Contempt Makes It Worse
Experts understandably have a hard time being respectful of interfering laypeople. But scientists’ visible contempt for the public’s concerns actually increases the risk of such interference.
Everyone (including me) agrees that it was a good move when H5N1 researchers declared a moratorium. But even the Nature letter (www.nature.com/nature/journal/v481/n7382/full/481443a.html) announcing the moratorium dripped with disdain.
Consider this over-reassuring sentence:
“Responsible research on influenza virus transmission using different animal models is conducted by multiple laboratories in the world using the highest international standards of biosafety and biosecurity practices that effectively prevent the release of transmissible viruses from the laboratory.”
Nothing can go wrong … go wrong … go wrong…. I don’t have space to document all the lab accidents that have released transmissible viruses. A 1977 lab accident is thought to have released the human H1N1 flu virus, which had not circulated since 1957; it spread globally for the next 32 years. As for the risk of an intentional release—and the systematic underestimation of that risk inside the flu world—see my article on “A Blind Spot for Bad Guys” at www.psandman.com/col/H2N2.htm.
Here’s a worse example:
“Despite the positive public-health benefits these studies sought to provide, a perceived fear that the ferret-transmissible H5 HA viruses may escape from the laboratories has generated intense public debate in the media on the benefits and potential harm of this type of research.”
Note the extraordinary lack of parallelism. We usually contrast benefits with risks, or, if you prefer, potential benefits with potential risks, or even perceived benefits with perceived risks. These are all parallel formulations. But the Nature letter doesn’t contrast the confidently asserted “positive public-health benefits” with risks … or with concerns about those risks … or even with fears about those risks … but with something much more ephemeral: a mere “perceived fear.”
As used by these scientists, public “perceptions” are misperceptions, and public “fears” are unjustified fears. If the insult here escapes you, think of a risk you take seriously and imagine someone labeling it a perceived fear.
Paradoxically, this contempt for public concerns might actually provoke stricter regulation of science. If scientists are nasty and myopic enough when claiming that only scientists’ opinions matter regarding what they do and what they publish, society might rebel against such unbridled scientific autonomy. It’s unlikely. Most people have a strong conviction that governments don’t know how to regulate scientists and we’re better off leaving them alone. That autonomy has nurtured a lot of scientific arrogance, but the arrogance hasn’t yet undermined the autonomy, and odds are it won’t this time either.
But if there’s a threat to scientific autonomy, it’s not coming from those questioning the wisdom of the two studies. It is coming from the arrogant, scientifically dishonest, risk-insensitive way some scientists are responding to the questioning.
What I’d Say
The H5N1 debate shouldn’t be a monologue. Especially for the side that wants to publish the two papers and carry on, listening is more important than talking. Validating the other side’s concerns is more important than talking. Implementing some of the other side’s recommendations for additional biosafety and biosecurity measures (and giving them credit for the improvements) is more important than talking.
But when the time comes for talking, here’s what I’d say:
“This is uniquely dangerous research, so much so that it has stimulated an extremely unusual push to regulate scientific research and publication. If we’re going to do such research at all, we need to prove that we’re taking safety and security seriously, we need to implement more precautions, and we need information about those precautions (and all infractions) to be publicly available. Moreover, we need to prove that the research is important enough to justify taking the sizable risks.
“This isn’t about research autonomy generally. It’s about whether it makes sense to create a possible monster in our labs in order to do research that might (or might not) have huge payoffs in preventing or fighting the natural monster that could emerge at any time. The research we’re proposing to do is only part of a coherent agenda to address the risk of a potentially catastrophic H5N1 pandemic, an agenda that includes the following other priorities…”
Peter M. Sandman ([email protected]) is a risk communication consultant based in Princeton, NJ. His writing on risk communication can be found at www.psandman.com.