Science and policy: keeping it real
The fallout from the recent debate in the UK about reclassification of illegal drugs has a lot to do with politics, but I’m wondering instead whether it tells us anything about how science is used and perceived.
For those not familiar with the incident:
In late October the UK government dismissed the chair of the Advisory Council on the Misuse of Drugs, an independent body charged with giving science-based advice. The dismissal came after publication of Dr. David Nutt's view on the relative harms of illegal drugs. He said that the evidence suggests alcohol and tobacco do more harm than LSD, ecstasy and cannabis, which goes against the government's position.
Amid the discussion and protest that followed, and that no doubt continue less publicly, Radio 4 aired an interview with Professor Robert Winston. In the course of the interview I heard Winston say that science isn't about certainty, but about probabilities.
This got me thinking about two things: the use of scientific advice in policy, and how science is perceived.
Starting with the latter: I wonder how many people think of science as anything other than a discipline that gives answers, or one that has failed to give them. Either way, certainty is presumed.
It could be that scientists, or science communicators, perpetuate this notion that science is there to give knowledge about what people are concerned or curious about; to answer fundamental questions.
Not much about this mission statement can be denied. But isn't this attitude anachronistic? Maybe science was about certainty centuries ago, when in Europe it became a source of answers, satisfying the basic need for certainty that most people have. But over time the questions science seeks to answer have become more complicated. When you talk about health, for example, you have to consider a system with very many components, from genes and proteins at the micro level to the physical and social environment at the macro level.
Scientists may well understand that this means answers inevitably come with shades of gray: assumptions, conditions, probabilities. But how upfront are we, as practitioners or communicators, about relaying this to the public? We talk about statistical tests, x linked with y, framing answers in black and white. The shades of gray sit in the limitations: study power, missing variables, imperfect data, and so on.
Maybe it’s time science is communicated in a way that captures these shades. The public can handle it: we deal with probabilities all the time in our everyday lives, and make our own choices that come with complexity.
If this is one way of keeping science 'real', there might be another way of getting there, though probably a more difficult and more controversial one. This is about social factors and policy options becoming more explicit in, and more integral to, scientific analysis. That brings me back to what the UK debate was initially about, as I see it: the role of scientific advice in policy.
One might say that an official would expect what a member of the public expects: a straight answer, something that classifies things, ranks things, and allocates resources based on priorities and, perhaps inevitably, politics. When a paper I covered [PDF] back in April proposed that politics might be taken into account in epidemiology, what emerged was a catch-22: no funding, and no established methods for doing it. And that was among people who believe the idea is worth pursuing. There is also the argument that factoring in social phenomena would sacrifice the objectivity on which science depends.
Perhaps it's true that it would open a Pandora's box of subjectivity. But ignoring the realities of policy decisions is also problematic: the advice becomes naive and ultimately incomplete.
Is it really possible to give evidence-based advice in a way that takes notice of these concerns but doesn't compromise the science? If pressed to propose an answer, I'd say a middle ground can be struck through communication.