Brian Wynne is Professor of Science Studies and Research Director of the Centre for the Study of Environmental Change at Lancaster University. He has a first-class degree and PhD in materials science from Cambridge University, and an MPhil in sociology of science from Edinburgh University. He has published several books on risk issues and has specialised in research and reflection on the interactions between scientific knowledge, public policy and public responses in policy domains involving risk, such as nuclear power, GMOs and climate change. In such work he has collaborated with relevant scientists and published several joint papers. As part of this interest in science and society, Brian has also conducted considerable research on 'public understanding of science' and was responsible for the earliest critiques of the dominant so-called 'deficit model' of the public's relationships with science. From 1994 to 2000 Brian was a member of the management board and scientific committee of the European Environment Agency, and was special adviser to the UK House of Lords Science and Technology Select Committee's March 2000 report, Science and Society. He is a member of the UK Biotechnology and Biological Sciences Research Council's expert committee on Responses to Public Concerns, and is involved in the European Commission's current development work on Science and Governance.
Why framing processes in risk science are important – but still neglected
It is widely assumed that the proper relationship of science to social considerations in policy making about issues like GMOs is represented by the so-called linear model. In this model, the facts (for example, of risk) are first objectively assessed; then, and only then, are human values and commitments allowed to shape the policy outcomes. Whilst controlled scientific knowledge of risks and other consequences is clearly important where possible, this 'facts first, values second' model overlooks two crucial problems. The first is uncertainty: scientific knowledge is not usually so free of uncertainties that it can determine clear policy conclusions. The second is that scientific knowledge is framed by subtle assumptions which are not usually visible or explicit, even to their authors. These are typically not deliberate political commitments, yet when expressed as knowledge of a policy issue they have political implications.
An example from biotechnology is the way in which risk assessment of plant GMOs in the EU originally omitted questions about biodiversity altogether, and under the amended EU Directive now defines biodiversity in ways which exclude, for example, questions about possible changes in soil microbiota biodiversity brought about by GM crops. This issue also relates quite closely to the uncertainty problem in the GMOs case. Whatever specific intellectual form it takes, this kind of framing is inevitable; but unlike accepted models of scientific practice, it cannot be described as the formulation of explicit and falsifiable hypotheses which are sceptically tested. Scientific research and its policy uses thus innocently reproduce those framing commitments without deliberate consideration of their validity and implications. The same is true of assumptions which environmental risk assessment has to make about farm crop-management practices. This presentation will review these problems with the role of scientific knowledge as policy authority for plant GMOs, and will explain a key distinction between science as a crucial instrument of policy and science as the entrenched culture of policy. It will then discuss some possible ways of addressing these issues as they impinge on public confidence in science and innovation.