Risk, uncertainty and power

ANDY STIRLING


IN Europe, as possibly in India, few phrases better express contemporary preoccupations in the governance of science and technology than the slogan ‘Knowledge Society!’ For the best part of a decade now, this agenda has underpinned pretty much every major area of EU and member state policy-making, extending far beyond research and innovation. In this essay, I unpack some implications of this mainstream vision in European politics, focusing special attention on how constrained and power-laden notions of knowledge play out in the governance of scientific uncertainty and technological risk. Though the detailed pathologies may be distinctively European, the diagnoses and prescriptions are likely to resonate in India.

A number of conclusions arise for building richer, more vibrant and diverse pathways for the evolution of science, technology and wider knowledges alike. As with other set-piece political discourses, knowledge society visions are woven from both hope and fear. On the hope side, the mainstream European notion is an antidote to the ‘risk society’ – promising beleaguered rulers some salvation from supposedly irrational and over-anxious citizens. In place of public concern and dissent, it prescribes uncritical faith in the trajectories for research and innovation favoured by incumbent institutions.

Yet, for historically sensitive and status conscious European leaders, this same mainstream vision also conjures up fears of late-colonial decline. It evokes a one-directional ‘race’ to a homogeneous global future, with Europe falling ever further behind competitors like the US – and increasingly India and China. Here, knowledge becomes just one further arena for asserting order and exclusion and exercising ambition and privilege, even as thwarted European economic aspirations and anachronistic global identities lend a particular political charge.

 

It is ironic that much of this high-level ‘knowledge society’ talk actually ignores so much of what we know about knowledge. Knowledge is routinely represented as a discrete quantifiable medium – to be appropriated, accumulated, measured and distributed according to universal templates. This is a convenient construct. Yet, when seen from vantage points other than expediency, the kaleidoscope of human knowledges is best understood in quite different terms.

As established in many disciplines, the essence of knowledge is that it is diverse rather than monolithic, unruly rather than ordered, relational rather than material, indeterminate rather than measurable, distributed rather than concentrated, normative rather than instrumental, often tacit rather than fully codified, contestable rather than definitive, power-laden rather than innocent, and dynamic and turbulent rather than predictable and steadily increasing. What is striking about the current European vision of the knowledge society, then, is that in each of these different ways, it involves an untenable understanding of knowledge itself.

 

These mismatches between rhetorics and realities of knowledge are best substantiated by looking at the governance of scientific uncertainty and technological risk. Here, European policy making is increasingly riven with polarized debates over the role of two iconic institutional and procedural innovations: the ‘precautionary principle’ and ‘public participation’. Precaution is a response to scientific uncertainty in areas like food safety, GM crops and chemicals regulation. Modest European efforts to institutionalize precaution have helped provoke international trade disputes, as well as precipitating an indigenous industrial backlash. Likewise, pressures for greater public participation are becoming more prominent as risk regulation moves to the European level and demands mount to manage conflict and enhance trust.

Yet these innovations towards precaution and participation are strongly resisted by numerous high profile voices. Branded as indiscriminately ‘anti-science’ and ‘anti-technology’, both are portrayed as obstructing the kind of progress epitomized by the knowledge society. If this mainstream vision is to be realized, then there is little place for these kinds of institutionalized questioning or scepticism. The ‘way forward’ (typically invoked without specifying orientation!) is held to lie not in precaution or participation, but in greater deference to established scientific expertise and enhanced support for the existing directions of technological development.

 

This presents a further irony. Not only does the ‘knowledge society’ embody a narrow and simplistic notion of knowledge; when contrasted with visions of precaution and participation, it also involves reduced and impoverished representations of science. After all, science is often distinguished (somewhat uncritically after Merton) in relation to other areas of culture, as upholding values of communality, scepticism and questioning. Yet, just as these qualities help ensure robustness and authenticity within processes of scientific enquiry, so too might they serve the external steering of science itself.

It is only if scientific and technological evolution is conceived deterministically – as a single-track competitive race – that moves towards precaution and participation appear problematic. When knowledge and innovation are viewed more widely as branching evolutionary processes, we begin to see the role for explicit and deliberate social choice. Precaution and participation are about recognizing and facilitating this reality of choice – providing for deeper reflection, more transparent motivations and greater democratic accountability over which directions are pursued in research and innovation, and which are not – and why.

Far from being indiscriminately ‘anti-science’ or ‘anti-technology’, the kinds of public questioning and scepticism represented in precaution and participation are typically highly specific. Concerns repeatedly focus on very particular applications in fields like genetic modification and therapeutics, bio-informatics and reproductive technologies, hazardous chemicals and waste management, military uses of nanotechnology, private urban transport, and nuclear power and weapons. In each case, as many alternative applications of science and technology are upheld and supported as are critiqued.

For instance, in the above examples, favoured pathways include non-GM biotechnology, ecological farming, non-property-intensive preventive healthcare, open-source informatics, urban public transport, green chemistry, closed-cycle industrial ecologies, contained applications of nanotechnology, distributed renewable power and energy efficiency, and nonviolent approaches to security. Precaution and participation are thus as much about encouraging new directions for research and innovation as about challenging some that are already established.

 

At root, European debates around precaution and participation concern the (largely invisible) entanglements of power in knowledge. The real challenges they present are directed not at science and technology themselves, but at the established cultures and practices through which these are governed. It is for this reason that attention is focused so strongly on institutions of risk regulation. As it stands, these offer the only real discursive spaces for formal questioning, scepticism or dissent over the general directions taken by science and technology. If concerns are to gain traction in current governance structures, they must be framed in the restricted technical terms of hazards and probabilities. Any position that cannot be articulated in such terms is held to be intrinsically ‘unscientific’ and so beyond the legitimate scope for consideration. In this way, prevailing ‘risk-based’ approaches involve a series of highly challengeable assumptions over the nature and scope of relevant knowledges in the governance of science and technology.

 

First, conventional risk regulation assumes that the demonstrated efficacy of knowledge constitutes sufficient normative authority for acceptance of associated innovations. This is qualified only if countered by direct, measurable harms. But this template ill-suits debates such as those over medical technologies that increase disparities in lifespans between rich and poor, enhance human faculties, permit new forms of social surveillance or control, or enable character-selection in children. In these and other areas, claimed benefits are typically not examined symmetrically with harms, but effectively taken as given.

Risk regulation asks only whether such interventions are ‘safe’ in the narrowest of senses. As emphasized by Wynne, no room is left for scrutiny of the purposes and motivations driving favoured directions for science and technology. This excludes centuries of political thinking (as explored, for instance, by writers such as Aristotle, Kant and Habermas) over the insufficiency of knowledge as a moral basis for action. It ignores that ‘know why’ is as important as ‘know how’.

A second general assumption is that if knowledge is adequate to enable innovation, then it will also offer completeness in understanding the consequences. Despite repeated prior experience with unexpected carcinogenic, mutagenic, neurotoxic and reprotoxic effects of synthetic chemicals, this continues to be presumed – for instance in regulation of nano-materials.

To treat seriously only those forms of harm for which there is already extant evidence excludes centuries-old insights, from writers like Lao Tzu and Socrates up to economists like Knight and Keynes, that what we don’t know is as important as what we do know. Since contemplating the unknown necessarily requires imaginings beyond the available evidence, it is treated as unscientific in conventional risk regulation. What is truly unscientific, however, is this effective denial of the unknown.

 

A third assumption compounds this dilemma, but is distinct and less acknowledged. It concerns not incompleteness, but the intrinsic indeterminacy of even the most apparently complete knowledge. No matter how much we think we know, we will always be subject to surprise. This applies even where there is complacent confidence that we face Rumsfeld’s infamous ‘known knowns’.

Approval for halogenated hydrocarbon refrigerants and aerosols was initially driven by the confident knowledge that these substances were benign (according to known criteria). Likewise, endocrine disruption in many household chemicals was missed because the mechanisms lay outside the array of known toxic effects. Even in the rigorously and exhaustively explored field of mathematics, Gödel showed axiomatically that completeness is always indeterminate. Writers on science and technology such as Collingridge and Ravetz highlight that we are nowhere more exposed to this ubiquitous inevitability of surprise than when entertaining hubristic perceptions of complete knowledge.

A fourth issue concerns the relationship between knowledge and ignorance, which is the inverse of what is conventionally presumed. Even if the above assumptions are avoided, it may seem reasonable to expect that increasing knowledge will decrease ignorance. This is why risk assessment frequently makes provision for further research to increase confidence in the resulting knowledge. Unfortunately, hard-won but oft-forgotten experience also shows this to be unfounded.

For example, advances in research and computing power reveal chaotic nonlinear dynamics in even the most highly determinate systems. Knowledges of specific outcomes in fields like climatology, oceanography and ecology have thus been recognized as less well-founded after such advances than before. Einstein’s much-credited analogy of a ‘circle’ of knowledge helps illuminate this: as the circle grows, so too does the associated ‘circumference’ of experienced ignorance. This recognition of correlated (rather than inverse) relations between knowledge and ignorance helps resolve apparent contradictions between the ‘knowledge society’ and the ‘risk society’.
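To make the geometry of the analogy explicit (a simple illustration, not from the original text): if knowledge corresponds to the area of a circle, A = πr², then contact with the unknown corresponds to its circumference, C = 2πr. Both increase together with the radius r – so every growth in the area of knowledge necessarily lengthens the frontier along which ignorance is encountered.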

 

A fifth assumption is that knowledge is additive. Even if it is conceded that more knowledge may increase ignorance, can we not also be confident that adding new knowledge at least means increasing our total knowledge? Unfortunately, this too is well established to be unfounded. Kuhn, for instance, highlighted that even in science, new knowledges are often incommensurable. In regulating GM crops, knowledges of geneticists, virologists, cell biologists, soil scientists, ecologists, agronomists, economists and sociologists are at various levels quite fundamentally in tension and so not amenable to simple aggregation. Knowledges of biotechnology entrepreneurs, chemical producers, plant breeders, industrial agriculturalists and subsistence farmers compound this. Arrow won a Nobel Prize for showing similar incommensurabilities in the heartland of neoclassical economics. Jasanoff details how they unfold between contrasting epistemologies of different social groups and political cultures. Yet, risk assessment persists in proceeding as if it were possible to add together different knowledge ‘inputs’ and arrive at just this kind of single ‘objective’ picture.

 

Sixth (and finally), risk regulation assumes the effective independence of facts and values. Yet, scholars such as Jasanoff and Wynne show that our understandings do not exist in innocent isolation, but are in fact reflexively intertwined with our interests. In sectors like nuclear power, genetically modified crops and nanotechnology, for instance, knowledges are actively shaped by our wider social, economic and technological commitments. Vast infrastructures are constructed on the basis of what we think we know. This does not simply increase exposure to associated ignorance; it also creates powerful pressures to exaggerate knowledge and suppress ignorance. The greater the commitment based on what we think we know, the greater the pressure to exclude the possibility that we might be wrong. This active political dynamic helps illuminate the otherwise inexplicable persistence in risk regulation of this series of highly expedient and instrumental – but manifestly false – assumptions about the nature of knowledge.

So what can be done about this sidelining of inconvenient attributes of knowledge in mainstream European visions of the knowledge society? What are the concrete implications of the rather daunting challenges for governance of research and innovation? How might we catalyse more sophisticated and nuanced styles of policy making on science and technology? What are the specific lessons of precaution and participation for risk regulation? It is to these practical challenges that the final part of this essay now turns.

 

A pragmatic starting point here is to be as persuasive as possible to those who remain attached to restricted, prosaic, risk-based understandings of knowledge. As with the mythical Trojan Horse, it is shared commitments (not pitched belligerence) that may best open the most firmly closed of gates. Such an entry point is provided by the fundamental distinction in technical notions of risk between two basic aspects of knowledge. The first concerns the possible things that might happen – the forms of harm, the potential effects and the social outcomes. The second concerns the degrees of likelihood attached to these possibilities – conventionally expressed as numerical probabilities. The essence of the above discussion is that our knowledges in each of these dimensions (as in others) may variously be regarded as adequate or problematic.

The logical possibilities that flow from this are represented in the simple matrix in Figure 1. This is not a taxonomy of cases or circumstances, but a heuristic expressing different styles of understanding. It is a gateway through which to convey more subtle and critical appreciations of the role of knowledge. In any given case, understandings will vary and often be mixed. The purpose of this picture is to illuminate the need for humility, reflection and transparency over the kinds of assumption discussed above.

Starting in the top left quadrant of Figure 1, then, conventional regulation of science and technology presumes confidence in the high quality of knowledges on both axes. This is the rigorous ‘sound scientific’ definition of ‘risk’ underlying the established methods shown here. Although such techniques are often not fully implemented in practice, they nonetheless provide the general quantitative idiom of regulatory ‘risk’. Here we find contexts in which these risk-based approaches may, on some views, appear quite unproblematic. Examples include relatively familiar, static, closed, deterministic systems – as often seen in structural or mechanical engineering or transportation, in well-documented epidemiologies, or in flood risks under normal conditions.

 

The real problem lies where acknowledged utility of such methods in such restricted contexts is taken to demonstrate their more general sufficiency. Elisions between strict ‘scientific’ and loose colloquial meanings of ‘risk’ compound confusion over the real applicability of these techniques, even in their own terms. This expedient double meaning of ‘risk’ is also open to active strategic manipulation. It is this dynamic that is fundamentally challenged by precaution and participation.

 

In the lower left of Figure 1, we find the strictly defined condition of uncertainty. Here, we are confident in our knowledges of possible outcomes. But the available empirical information or analytical models are not seen to present definitive grounds for assigning probabilities. Examples may be found in dynamic, open, indeterminate systems, like complex energy or communications infrastructures, patterns of use of carcinogenic chemicals, epidemiologies of novel pathogens, or flood risks under climate change. It is under these conditions, according to the celebrated probability theorist de Finetti, that ‘probability does not exist.’ Of course, we can still exercise subjective judgments and treat these as a basis for systematic analysis.

FIGURE 1

Methodological Responses to Problematic Knowledge
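(The original figure is not reproduced here. As described in the surrounding text, its two axes distinguish knowledge about possibilities and knowledge about likelihoods, each regarded as either unproblematic or problematic: where both are unproblematic lies risk (top left); problematic likelihoods alone yield uncertainty (bottom left); problematic possibilities alone yield ambiguity (top right); and where both are problematic lies ignorance (bottom right).)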

However, the challenge of uncertainty is that such judgments may take different, equally plausible forms. Rather than reducing these to a single expected value (or prescriptive recommendation) – as is normal in risk assessment – the scientifically rigorous approach is (contrary to ‘sound science’ rhetoric) actually to acknowledge this subjective diversity. Again, this contrasts with colloquial (and actively strategic) usages of the term ‘uncertainty’, under which it is often taken to accommodate the same restricted risk-based methods. Such usage effectively denies the value of a range of practical systematic methods for addressing strict uncertainty, as shown in Figure 1. It is a fundamental message of precaution that attempts to assert aggregated risk-based understandings under uncertainty are neither rational nor scientifically rigorous.
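The force of this point can be conveyed with a stylized arithmetic example (an illustration, not from the original text). Under a strict condition of risk, an expected harm can be computed by weighting each outcome by its probability: a harm of magnitude 10 with probability 0.1 yields an expected value of 10 × 0.1 = 1. Under strict uncertainty, no definitive probability is available: equally plausible judgments of 0.1 and 0.5 yield expected values of 1 and 5 respectively, and no single aggregate number can honestly stand for both.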

 

Under conditions of ambiguity (top right quadrant), it is not knowledges of likelihoods, but of the possibilities themselves, that are problematic. This may apply even where outcomes are thought to be certain – or have already occurred. Here, ambiguities still arise in characterizing outcomes and interpreting their normative meanings: for instance, in selecting, partitioning, bounding, measuring or prioritizing different possibilities. Put simply, ambiguity raises the thorny problem of ‘apples and oranges’. How should we compare benefits with harms, or different forms of harm (e.g. impacts on rich or poor, workers or public, children or adults, present or future generations, humans or nonhumans)? Extending attention to broader values, ethics and ontologies reveals even greater complexity and intractability. Such dilemmas are routine where there are contending experts or multiple disciplines, or where issues emerge over equity, trust or compliance.

In the regulation of GM food, for example, we find conflicting ecological, agronomic, safety, economic and social perspectives. Issues come to a head in considering the right questions to pose in regulation: asking which option is ‘safe’, ‘safe enough’, ‘acceptable’ or ‘best’ may yield radically different answers. The reasonable response to ambiguity is to entertain a plurality of different interpretations. Whilst none presents a panacea, a series of possible approaches to this ‘opening up’ of ambiguity is highlighted in Figure 1. The bottom line remains that under ambiguity, conventional reduction of relevant knowledges to aggregated notions of risk is even less rational than under uncertainty. This is the challenge addressed by moves towards more participatory institutions.

 

In the bottom right corner, there is the condition of ignorance. Here lie some of the most profound difficulties for the conventional regulatory approaches discussed earlier, in that it is acknowledged that neither probabilities nor outcomes can be definitively characterized. When ‘we don’t know what we don’t know’, we face the ever-present prospect of surprise. Ignorance differs from uncertainty, which focuses on agreed known parameters like carcinogenicity or flood damage. It differs from ambiguity in that parameters are not just contestable, under-characterized or indeterminate in their relative importance, but are unbounded or at least partly unknown. Some of the most important environmental issues of our time have involved challenges that were – at their outset – of just this kind.

In the early histories of stratospheric ozone depletion, ‘mad cow disease’ and endocrine disrupting chemicals, for instance, the initial problem was not so much divergent expert views or mistaken understandings of likelihood, but straightforward ignorance over the possibilities themselves. Here, more than anywhere, it is profoundly irrational and unscientific to seek to represent ignorance as risk. Yet, once again, Figure 1 indicates a range of practices and strategies for producing and articulating diverse knowledges and so fostering greater awareness, reflexivity and humility. The earlier discussion shows that these cannot in any simple way ‘reduce’ ignorance, or limit our exposure to it. Yet they can assist in informing a more socially robust approach to understanding and acting on the consequences.

By showing how these responses to ignorance relate to the other approaches, the key message in Figure 1 is one of pluralism. There are many ‘tools in the toolbox’. Many of the most important challenges of knowledge in technology governance are entirely unsuitable for hitting with the ubiquitous risk assessment hammer. Most crucially, this rich diversity of methods must be embedded in the wider discourses, institutions and practices pioneered in current moves towards participation and precaution.

 

The final question raised by this picture, however, is why there remains such adherence – both in Europe and more generally – to such manifestly restricted discourses, constrained institutions and insufficient methods in the governance of science and technology. Why are mainstream visions of the ‘knowledge society’ so persistently impoverished? Why are the concrete implications and pragmatic prescriptions of precaution and participation so comprehensively challenged and marginalized? Why do we remain fixated with such a limited array of risk-based tools?

This brings us back to the theme with which we began: the conditioning role of power in shaping our knowledges. This time, the knowledges in question are our understandings of knowledge itself. Figure 2 makes use of the same heuristic framework employed in Figure 1 to illustrate the effect of these epistemic dynamics of power. It shows how an array of different pressures, practices and institutions tends consistently to move understandings of knowledge away from the lower right hand quadrant and towards the upper left.

 

Liability law, for instance, often effectively allows private decision makers to ignore those possible outcomes which may reasonably be claimed to be unknown. Even if these outcomes actually transpire, decision makers will be protected from the repercussions of clearly fallacious assumptions that unknowns do not exist. In this way, liability law moves the beneficiaries from a condition of ignorance to one of effective uncertainty. Likewise, the practice of insurance converts strict conditions of uncertainty into a much more tractable state of actuarial risk – at least for those who are protected by the terms of the contracts. Although in many ways very different, both optimising economic analysis and the procedures of deliberative consensus have the effect of converting conditions of ambiguity into a much more expedient, aggregated form of risk.

In all these examples – as in the others indicated in Figure 2 – the effect is to ‘close down’ what counts as a legitimate or plausible representation of knowledge. This in turn provides the vital political resource of justification – allowing ‘decisions’ to be conceived, asserted and defended, and ‘trust’ and ‘blame’ to be effectively managed. As a result, powerful incumbent interests manage to externalize the consequences of intractability. The inconvenient limitations of knowledge do not disappear, of course, but are simply rendered invisible. And this is only until they accumulate sufficiently to ‘bite back’ with the tragic visibility witnessed at Bhopal, Chernobyl and the global ‘credit crunch’. In this self-reinforcing dance of imperatives, restricted risk-based methods are both produced by – and actively help produce – the wider political dynamics. This is the predicament neatly described by Beck as ‘organized irresponsibility’.

 

In its overall conclusions, this analysis is hardly surprising. To anyone who is politically aware, it is little more than common sense. What is relatively unusual – at least in the context of current European debates – is explicit acknowledgement of the politics underlying our understandings of knowledge. It is because such discussion remains unusual that democratic engagement in the governance of science and innovation continues (if present at all) to be confined to the narrow terms of risk regulation. This is why terminologies of risk and uncertainty remain so expediently confused. This is how there persist such spurious anxieties over perceived tensions between precaution and participation (on the one hand) and wider aspirations to a ‘knowledge society’ (on the other).

 

When knowledge is appreciated more realistically – as plural, distributed and incommensurable – then the crucial issues become much clearer. The key challenge is not one of contention between general ‘pro-’ and ‘anti-’ positions in a single-track ‘race’ to progress an undifferentiated and pre-determined science and technology. Only then would it just be about ‘how far’, ‘how fast’ and ‘who leads’. What really lies at stake are the future directions for science, technology, research and innovation – and for our multiple knowledges themselves. Here there lie far deeper queries over ‘which way’, ‘who says’ and ‘why’. When Europe, or for that matter India, finds itself hosting a vibrant politics of open and mature deliberation over these kinds of question, then we will know that the ‘knowledge society’ has truly arrived.

FIGURE 2

Effects of Power on Representations of Knowledge
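(The original figure is not reproduced here. As described above, it employs the same quadrant scheme as Figure 1 to show pressures, practices and institutions – liability law, insurance, optimising economic analysis, deliberative consensus and the like – each moving representations of knowledge from ignorance, uncertainty or ambiguity towards the more tractable and expedient upper left quadrant of risk.)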

 
