Getting with the times on scientific advice — deciding by dissensus?

The discussions I heard about scientific expertise and evidence-based policy at the STEPS Symposium in Brighton last month still reverberate in my mind.

The organisers gathered an eclectic group. Several key messages were nicely captured in news and commentary by David Dickson (here, here and here). But one of the ideas that has kept me intrigued is the one put forward by Mike Hulme, professor of climate change at the University of East Anglia (UEA).

Hulme questioned whether consensus about scientific evidence is the best way to make decisions. The long-held notion that consensus is necessary because it conveys authority stems from the rational process of science, but it may not be serving us well, he said. Just look at the Intergovernmental Panel on Climate Change (IPCC) process and ‘Climategate’ (the 2009 leak of emails between scientists at UEA). Although IPCC reports have gone to great lengths to communicate the uncertainties in evidence, the goal was to present a consensus view about climate change, a goal that ended up being undermined by small errors that, in the bigger picture, changed little about the assessment’s main conclusions. And in revealing a culture closed to criticism, Climategate showed how the need for consensus trickles down to scientists’ everyday work.

The alternative, Hulme said, is to make decisions by acknowledging minority views, that is, with dissensus. Expert elicitation or voting could be explored as ways to make disagreement explicit, he suggested. The audience raised some valid concerns: what would this mean for groups that manufacture doubt to influence public perception? And how could such openness work in practice?

The very real complications that come with such a proposition don’t take away from its value in responding to some timely issues around how science functions in society. The time when science was about testing clearly defined questions for a yes-or-no answer is long gone. The questions are now much closer to the messy business of life; they reflect complex phenomena that no single scientific field can handle alone. And the growing use of science in decisions that affect people’s lives demands a process that can incorporate input from communities more effectively.

In all likelihood, this is nothing new for those who work in the ‘space’ between science and policy. The voices that come from ivory towers and closed-door meetings bemoan the fact that no one understands uncertainty, and they stress how impossible it is to give the one straightforward answer that policymakers and the public crave.

But norms and social perceptions are slow to change, and this applies to the scientific community as much as it does to the public and policymakers. To me, what Hulme’s point shows is that the change needs to start at home.

I can cite the example of a blog post in these pages that discussed scepticism at a time when climate change controversies were very much alive. And it wasn’t easy to write about scepticism in this way, knowing full well that it could be misinterpreted now that the word has become synonymous with denial.

But as Hulme said at last month’s meeting, the widespread perception that science offers an unchallengeable view of the world, one that converges on a certain answer, only has traction in a culture or a society that thinks science is about certainty. So the wider question is about how to open up science, about the role of blogs and spaces for dialogue. Why not show science in the making, argument in the making?

Showing that dissenting views have a place in scientific discourse could be one way to proactively bring the business of evidence-based advice up to date. It may not be the answer. But it is one way to at least puncture the system in which we are all used to operating and move towards a truly modern way of making the best use of evidence.