To disregard expert advice in a misguided pursuit of democracy is naive and potentially dangerous, writes David Ball
It takes a lot to quit a government advisory committee working on a matter of immense national importance. It is especially hard when you feel that it was, as I had said at my interview, the job for which I had been training all my life. But in the spring of 2005, having corresponded with both the chairman of the Committee on Radioactive Waste Management and a minister from the Department for Environment, Food and Rural Affairs, which had set up the body, I decided that the situation was beyond hope.
Resignation was the only way to prevent further waste of public money and to draw attention to the issues.
At the core of my frustration was my feeling that the committee I was a part of was rejecting scientific input in a way that I had not encountered before. Nuclear waste, especially the potent mix accumulated in the UK, is among the most dangerous material in existence if improperly managed. Understanding how best it might be disposed of is a matter for the specialist, surely? But the committee which this week recommended burying that waste underground seemed intent on burying expert opinion throughout its deliberations. Its approach put science second to an extensive programme of public and stakeholder engagement.
Of course the committee and Defra would deny any "science on tap" approach.
Its terms of reference give no hint of any intent to curtail scientific input. But I was not the only one who thought otherwise. Individual experts consulted by the committee complained about how their expert knowledge was compartmentalised and constrained.
In 2005, the House of Lords Science and Technology Committee released a highly critical report on the committee's use of science, while in January this year the Royal Society issued a press release that began: "It is vital that (the committee) obtains stronger scientific input as it moves into the final stages of its work...".
In some circles there is a belief that science is discredited and that scientists cannot be trusted. It is claimed that "science no longer holds any absolute truths", a view that permits relativism to prevail so that, in extremis, "all views (of risk) should be given equal credence as subjective representations of alternative realities".
Someone with such a relativist tendency would promote public consultation above technical input. This attitude would certainly be consistent with comments I regularly heard from fellow committee members to the effect that "no two scientists would give the same answer to any question", along with the intimation that reliance on technical input was somehow mistrustful of the public. The committee then initiated actions that led eventually to the sacking of its only expert on the health effects of ionising radiation.
Soon after, I resigned in protest.
The paradox is that the high moral ground the committee sought to gain by basing its decisions on public opinion appears to run counter to the public's own preference. In 2005, a major opinion survey found that most people would prefer decisions about science and technology to rely on experts' advice about risks and benefits. There are other difficulties too, not least the requirement for comprehensive risk assessments of the options under consideration, an approach the committee certainly rejected while I was a member.
It would seem that science, society and decision-making are at some sort of crossroads. Science, by its nature, is constantly self-critical. It is thereby vulnerable to charges of impermanence. This has opened the door to a new breed of expert who would displace or diminish science's role, replacing it with public dialogue through focus groups, citizens' panels, juries and the like. This influential band has yet to prove its worth and is enjoying a honeymoon while its adherents experiment with decision-making and democracy.
But given that nuclear waste will have health and welfare implications for thousands of years, is this really the right matter on which to conduct such an experiment? Involving ordinary folk in complex decisions necessitating an understanding, for example, of security from terrorism, of the half-lives of nuclear isotopes and of the permeability of granite is an interesting idea. But before this type of "concern-driven" decision process is substituted for expert judgment, it should first be tested for its reliability. There is such a thing as a duty of care. If we do not feel we owe it to our own generation to be more careful, we should at least think of those who will follow us.
David Ball is professor of risk management at Middlesex University. He was a member of the Committee on Radioactive Waste Management until mid-2005.