Q&A | Bird flu research and the limits of scientific censorship
A controversy over scientific freedom and the free flow of information has erupted after the U.S. government asked scientific journals not to publish details of studies on the bird flu virus.
The unpublished studies show how scientists mutated the virus to the point where it became highly transmissible in ferrets, which are considered the best indicator of how flu viruses might behave in humans.
The U.S. government was apparently concerned this information might get into the wrong hands as a kind of biological weapon.
CBCNews.ca spoke with Dr. Peter Singer, of the University Health Network and the University of Toronto, about the case and censorship in scientific research.
Five years ago, Singer was a member of a panel of the U.S. National Academies and the Institute of Medicine that produced a report called Biosecurity, Globalization and the Future of the Life Sciences.
"In that report, we said that science, the free flow of information in biological science, was critical and the benefits of that vastly outnumbered the harms," Singer said Wednesday.
"The cases of limitation of free flow of scientific knowledge should be as rare as hens' teeth."
CBCNews.ca: Is it unusual to see a government stepping in as it has in the bird flu case?
Dr. Peter Singer: The core value of science is truth and so whenever you see a case of censorship of science, it should be highly unusual. In fact, every couple of years, there is an example of a research experiment, sometimes intentional, sometimes unintentional, that has implications for bioterrorism, that could be misused.
A classic one was on polio — construction of the polio virus in a laboratory. There was another one digging up the 1918 flu strain.
But it is and should be very unusual when science is censored, because the free flow of science, the free flow of information, honesty, integrity and truth, are core values of science.
Is the research at the heart of this debate fairly routine?
It's not way out there in terms of the methods. The details have not been published, but my understanding is that it's genetic engineering of an H5N1 flu virus, and the net result is that, at least in ferrets, it passes more easily from ferret to ferret and it's more lethal, killing about 50 per cent of the time.
Those are the two things you have to keep in mind with flu: how easy is it for it to go from person to person — that has to do with spread — and how dangerous is it once it infects you — that has to do with lethality. And this one, at least in ferrets, seems to have both.
So that does make it an experiment that's worth paying attention to, or as U.S. policymakers have called it over time, "an experiment of concern."
To what degree is there a threat of bioterrorism?
This is definitely a bit of information that could be used for good or ill.
It's a good question as to the motivations and capabilities of people who would want to commit bioterrorist acts, but there's at least enough circumstantial evidence of motivation — al-Qaeda tried to recruit microbiologists, and scientific journal articles were found in al-Qaeda hideouts — that one really wants to be cautious in regards to issues of bioterrorism and biosecurity and advances in scientific knowledge.
Having said that, it should be extremely rare and unusual, if ever, to censor the findings of scientific experiments.
So how do you strike a balance?
In this particular case, there's actually a well-established structure and it's called the NSABB, the National Science Advisory Board for Biosecurity. That's an advisory body to the U.S. government, which has been set up among other reasons to assess and advise on experiments like this.
It's extremely important that there is a clear, transparent and straightforward process for cases like this, because the worst thing is the sort of grey, fuzzy self-censorship that scientists can do, or non-transparent censorship that governments could do.
The NSABB decision in this particular case — you could almost think about it as Solomonic, because what they were choosing between was publish the whole thing as is, as it would normally be published, on the one extreme [and] on the other extreme, censor it all. And their decision was to publish the conclusions and the public health implications but to leave certain details of the methods unpublished.
I think that is a Solomonic decision. It does strike a balance, because this is an experiment that I think could be misused.
Essentially you've got two sides to the argument here.
The argument for publishing everything is that that information is valuable to the good guys to set up defences and vaccines and other types of biological defences against the misuse of this, not to mention vaccines for positive uses, as well.
On the other hand, the argument against publishing the details is that some of these details could constitute a recipe for bioterrorism.
And you don't want that information to fall into the wrong hands, so there are very strong arguments on both sides.
When you have a Solomonic decision like that, does trying to satisfy two competing interests end up not satisfying any?
You definitely don't fully satisfy any. You don't fully satisfy those who would argue for the openness of science without any censorship of science whatsoever under any circumstances and that is a pretty powerful argument because of the importance of openness and transparency as a core value of science.
We've got lots of historical examples where even a slight undermining of that core value has led to very serious abuses in the name of science.
Which one would come to mind first?
Whether that's psychiatry in the Soviet Union, whether it is state-sponsored biological warfare programs of various countries, whether it is the misuse of science in the Holocaust, these are very extreme examples and obviously there's a slippery slope argument that you have to believe.
But what I'm trying to emphasize by using these examples is just how important a value openness, transparency of science, is and how censorship should be resisted.
Given what we're seeing the U.S. government say and what we have seen from the NSABB, do you think it's black and white enough, or is it a little bit grey right now?
I think it's black and white enough. You have a transparently established structure whose members are known, who are making a public opinion or recommendation to journal editors and to this researcher who will then govern themselves accordingly. I think that this does constitute a transparent process.
If you are a scientist doing cutting edge research in these areas, is it better to publish, or keep it under wraps? What should you do with your research?
It's better to follow the science to where it leads, but at the same time to stay attuned to the potential uses of the research and, in areas of questioning, to do what this researcher eventually did, which was to have the question asked openly through a transparent process.
One of the other issues here is the need for awareness on the part of scientists and graduate students about the potential uses and misuses of the knowledge they create.
But at the same time, science is a full-court press on the truth, and so no scientist or graduate student should take a decision like this as anything other than a full encouragement to follow the path of science and their questioning wherever it leads.
This interview has been edited and condensed.
With files from The Canadian Press