Several months before 11 September, Australian scientists published a paper describing how they had unintentionally created a “supervirus” that, instead of sterilising mice as intended, killed every last one. Could this information help someone to create a human supervirus in the same way?
And in 2002 American researchers described how they had made a polio virus from scratch by mail-ordering bits of DNA. The method could be used to build far more deadly viruses.
These papers are now at the heart of a fierce debate. Are such articles more of a gift to would-be bioterrorists than to civilised science? If so, should they be published at all?
The US has already introduced a barrage of legislation, such as the USA Patriot Act, to restrict access to dangerous pathogens and determine who is allowed to work with them. There are also moves to limit access to unclassified but sensitive information. But what constitutes “sensitive” is the greyest of areas.
“We have a tradition of classified and unclassified information, and we don’t take issue with that,” says Steven Teitelbaum, president of the Federation of American Societies for Experimental Biology. “But this third category of sensitive information is poorly defined and lends itself to arbitrary interpretation.” As a result, he fears that scientists will have to face new layers of government interference.
In February 2002, for instance, scientists funded by the US Department of Defense were told they might soon be required to submit their work for review before publishing it. In addition, the government has removed thousands of pages of unclassified technical reports from its web pages. And in late 2002, the Department of Agriculture asked the National Academy of Sciences (NAS) not to release a report it had compiled on agroterrorism, even though all the material in the report was publicly available.
Before 9/11 and the anthrax attacks, most biologists would never have considered withholding results from publication. Outside of private companies and defence-related projects, the free exchange of information is a cornerstone of scientific culture. So the steps taken by the Bush administration have come as a shock to many researchers.
“For scientific openness, this has been an earthquake, an avalanche and a tidal wave rolled into one,” says Steven Aftergood, who monitors government secrecy at the Federation of American Scientists in Washington DC.
However, John Marburger, director of the US government’s Office of Science and Technology Policy, says federal agents are not about to start confiscating papers. And many scientists do agree on the need for a review of what biological information should be made public, and feel that research whose only application is to make a disease deadlier or easier to deliver should not be published. The problem is that a great deal of research could be put to destructive use.
Techniques to deliver therapeutic proteins as sprays or pills, for instance, could also be used to deliver toxins. And research on viruses for gene therapy could help those trying to engineer bioweapons such as a virus that kills populations with specific genetic characteristics.
As molecular biology grows ever more powerful, the risks will only increase. Yet some argue that anything other than the most limited restrictions would not only impede research, but could also help terrorists by tying the hands of scientists trying to devise countermeasures.
There are signs that the US government is backing off from its more extreme position. After widespread criticism, for example, the defence department withdrew the directive requiring its scientists to submit to review before publication. And after discussions with the government, only one chapter of the NAS agroterrorism report was restricted. Rather than the government drawing the line, the White House wants to move to a system where scientists police themselves.
This is what most biologists want too. But there is little consensus on what information to withhold or how to make such decisions. Even the authors of the two controversial papers do not agree on the need for censorship.
Ian Ramshaw at the Australian National University in Canberra, one of the authors of the mouse paper, now leans towards not publishing. “We showed how a gene can make a virus highly pathogenic. That’s not a grey area.”
But Eckard Wimmer at the State University of New York at Stony Brook, whose team created the polio virus, defends both papers as basic research, far removed from practical bioweapons applications. “There are easier ways of making nasty viruses,” he says. “And because something works in a mouse virus, that isn’t a sign it will work for human ones.”
So who should make such decisions? The most popular suggestion is that the editors of journals act as censors. Already, the American Society for Microbiology is asking its 11 journals to “discourage any use of microbiology contrary to the welfare of humankind, including the use of microbes as biological weapons”. All the society’s reviewers will be asked to comment confidentially on the potential for misuse in submitted manuscripts. Editors will then heed that advice when deciding whether to publish.
Alternatively, rather than block papers entirely, editors could simply delete key details on sensitive experiments. But critics say this would seriously degrade the value of the scientific literature, since other investigators could not replicate the work. Also, what happens if authors disagree with a journal’s censorship? They could just submit their paper to other publications.
Another problem with relying on editors to vet papers is that scientific research is often presented first at meetings, and it is also becoming increasingly common for researchers to post interim results on websites. “Leaving it to journals would allow a lot to slip through the net,” says Brian Spratt of Imperial College, London.
Spratt is co-author of a report by Britain’s Royal Society on ways to strengthen the Biological and Toxin Weapons Convention. It says there has to be an appropriate response to the dangers of bioweapons information at every level of academic life, based on a scientific code of conduct.
It recommends that awareness of these issues should be part of every student’s training. And when university committees review research for ethical and safety considerations, they should also consider the security implications. Finally, the report envisions an international advisory panel to set standards and ensure consistency.
But Teitelbaum is wary of plans with many levels of control. “Bureaucracies want to justify their existence. They tend to be overzealous,” he says. Scientists will steer clear of research that attracts such negative scrutiny, he thinks. “The best scientists will not work in areas they can’t publish.”
Teitelbaum also argues that any ban that goes much beyond direct weapons research has rapidly diminishing benefit. While legitimate scientists would abide by a policy of secrecy, determined rogue states or terrorists might still get their hands on sensitive information. “By keeping more secrets we are placing ourselves in more danger,” he says. “The information becomes more likely to get into the wrong hands before it gets into the right hands.”
But Gigi Kwik at the Center for Civilian Biodefense Strategies at Johns Hopkins University says that the measure of an effective system should not be its simplicity. And she is worried, for example, that many scientists are pushing for information restrictions covering only the 38 or so select agents considered the greatest risk. “If that is all we do, then it’s easy to see a terrorist’s next step,” she says. “They will go for number 39, which is still nasty, but didn’t make our list.”
SOURCE: NewScientist.com, 18 January 2003