How can journal editors keep lethal research out of the hands of terrorists? Donald Kennedy reports
C. P. Snow, in a series of splendid novels, explored the cultural gulf between the sciences and the humanities. The Two Cultures problem of which Snow wrote resurfaces from time to time, as in the science wars debate about whether scientific truth is culturally "situated". That cultural divide has largely disappeared, but it has been replaced by another two-cultures problem that is even more troublesome. The new problem is the separation between the cultures of science and security.
The two are being brought into much closer contact as the new war against terrorism proceeds. The security community has been given a plethora of new things to worry about, and many of them have given it new reasons to engage with the scientific community.
It is apparent that certain kinds of scientific information - particularly in microbiology or biomedicine - might, if published, help terrorists or unfriendly states design bioweapons of mass destruction or ways of delivering them. People in the security community would worry if such information were published. Most people in the science community would agree that it should not be published, but agreement is harder to find on how often such information actually appears in papers submitted for publication.
Under the sponsorship of the National Academies and the Center for Strategic and International Studies, representatives of the security and the science groups in the US recently spent a day exploring this ground to see how much of it was common. A real effort was made to create dialogue, and it usually worked. But beneath the surface one could sense some tension: scientists perhaps questioning how much the security people knew about the science, and some security people wondering how much the scientists really understood about the danger and about the cleverness of those dubbed the "bad guys".
This same kind of tension surfaces in another sector, through current troubles at the US national weapons laboratories (Los Alamos and Lawrence Livermore), which are managed for the government by the University of California.
Many security-minded critics have attributed administrative failures and security lapses at the laboratories to an academic mindset that overemphasises scientific welfare and underemphasises security.
Critics on the other side are quick to blame episodes such as the fiasco surrounding Wen Ho Lee - who was cleared of passing nuclear secrets to China - on the over-zealousness of the security community.
But nothing is really new under the sun, and the same conflict emerged in a very problematic way in the early 1980s, when regulations designed to prevent the transfer of weapons specifications were suddenly applied to basic research. That problem almost created a showdown between the two communities; after a debate involving US universities and the Department of Defense, it was eventually resolved with the intercession of a report from the National Academies. The report, named for the committee's chair, Dale Corson, found it particularly difficult to deal with "dual-use technologies" - those with highly beneficial civilian uses, yet with some potential for military application.
In a way, that is exactly where we are now, as we contemplate the tasks of authors and editors with respect to publication. The papers that might present some risk of instructing a terrorist or rogue-state plotter are likely also to contain information of value, either to those developing counter-terror strategies or, more generally, to those protecting public health. In that sense, the technologies are dual use, like those that engaged the Corson Committee.
Thus, the issue of publication has to be approached with delicacy, in a way that respects the prospective value of the information as well as its potential risks.
The statement published this week by a joint group of editors and authors in Science, Nature and the Proceedings of the National Academy of Sciences is the outcome of a day-long meeting that followed the workshop sponsored by the Academies and the CSIS. The participants had a promising base from which to work: the workshop had made good progress in creating understanding, and the worst caricatures - "naive geeks" versus "ignorant spooks" - never surfaced.
The editors, authors and others reached a general sense (the word "consensus" might be putting it too strongly) that there was scientific information that should not be published, yet they reinforced the view that benefits and costs had to be considered concurrently in reaching such decisions. The statement is perhaps unremarkable in that it poses no radical policy departure. But it makes good sense, and that's a place to start in getting the two cultures together.
Donald Kennedy is editor-in-chief of Science and president emeritus of Stanford University. This article is taken from his editorial published today in Science.