Not-So-Frequently Asked Questions: Spring 2001.

 

3. Uncertainty
Science and Public Affairs
December, 2000.
Wayland Kennet.
 
 

When we think to encourage our children into "understanding science", and into "careers in science", what do we mean? Where does science actually fit in today's world, which, together with its applications in technology, it is so largely shaping?

The genesis of science is the reduction of ignorance. Along the insecure frontier between ignorance and knowledge, different degrees of uncertainty prevail:

 

1. What we don't know but know we could know. Nice.

2. What we don't know but think we could know. Challenging.

3. What we don't know and don't know whether we can know. Tantalising.

4. What we don't know and know we can't know. Salutary.

and

5. What we don't know and don't know we don't know. Numinous, threatening, and forgotten on pain of death.
 

The practical thrills and achievements of science come under 1 and 2. All of this is part of the epistemology of science, which, though complicated, is - or should be - clear.

The ethics of science is a more difficult, and murkier, scene. Voices are raised, and those unused to philosophical categories are strongly engaged. To them, the uncertainties of science are distasteful, and should not be harped on. Yet, as Sir Hermann Bondi, Scientific Adviser to several government departments, found,

"it was very much harder to convey scientific uncertainty and its limits than to explain clear knowledge".

["Science in Parliament", Vol. 56 No.4, Autumn 1999, p.5]

Today's most immediate drama - the BSE upheaval - concerns the handling of science's uncertainties. Under the last government, officials and some vets in the Ministry of Agriculture knew there was a new cattle disease (BSE) which might have derived from some new feed produced, but not described, by the usual manufacturers. The disease might or might not be transmissible to other species, including humans. There was apparently an increase, and some concentrations, of the known "corresponding" human disease, CJD. Officials concealed this from ministers, and junior scientists were ordered to rewrite their reports. Later, when ministers were informed, they also decided to keep quiet. When certainty finally emerged, it turned out to be disastrous.

There had been ignorance and culpable concealment on the part of the industry, the relevant scientists, the officials and the ministers. In hindsight it is obvious that the Precautionary Principle should have been applied, but it was not.

Decisions which have to be taken on inadequate evidence are often about the safety of people - nearby, far away, unborn; animals; plants; and of course, last but not least, about the hopes of the scientist and of the scientist's political, commercial, and even academic masters. "Risk assessment" may take place, but it is an art, not a science: forecasting the future, on the basis of past experience, which may or may not have been scientifically evaluated. These decisions and those which later face his or her colleagues, professional body, regulators or tribunals, and at the top of the heap the country's legislators, are value, not scientific, judgements. They will be taken within the prevailing moral and political climate: there is no alternative.

This climate may well be decent, but there are moral microclimates, some good, some bad; and the one affecting decisions on the safety of the exploitation of new science has, over the last twenty years, not been very good.

What is the difference between "ethics" and "morality"? Ethics is the actual analysis and discussion of what actions are right and what are wrong, and of how one can know the difference. Thus the statement "it is about saving lives, so it must be ethical" is ethically illiterate. Equally, the two injunctions "Thou shalt not kill" and "An eye for an eye" are both ethical statements: but we judge one right and the other wrong. Every human decision, action, word or gesture is susceptible of ethical analysis - including any prevailing moral climate.

Looming over the BSE saga, because smiled upon in high places, was the false god of economic growth measured in money terms - otherwise commercial profit. A risk to public health had been recognised, but to have taken the draconian measures needed to reduce it would have cost the industries money, or the state in making up their loss, or both. And both possibilities were hateful to the governments under which these policies began. This value judgment - the operational morality of the time - was deeply ingrained. We are now shocked, and rightly. A few of us are dead.

This elevation of economic growth to the throne of the social values is one thing, but it has to be considered together with another: the recent elevation of science itself to the throne earlier occupied by revealed religion. The Committee on the Public Understanding of Science has sometimes sounded like the Society for the Propagation of Christian Knowledge: eager to preach the benefits of scientific achievement, slow to mention either the problems or the scale of uncertainty, and silent about the dangers the public can perfectly well see for itself.

Have the two even begun to fuse already? (The getting of wealth is the beginning of virtue, and it is a mainline duty of science to make it easier.) Today, the industry lobbies are bad-mouthing the Precautionary Principle, as if the examination of science's gift-horses were an anti-science activity. We are in any case bound to observe the Principle, through the Rio Declaration and the European Treaties. So how to interpret it?

A few months ago a spokesman for Novartis told the Parliamentary and Scientific Committee that:

"if the existence of reasonable doubt (about the safety of a product or procedure) is sufficient to justify 'the precautionary approach', then reasonable evidence should also be sufficient to allow the product/service to proceed."

The logical failure is striking: do Novartis researchers not know the distinction between the inadequacy of accumulated examples to verify a statement, and the sufficiency of one negative to falsify it? (If in practice they did not, the firm would soon be out of business.)

It seems to me salvation can only lie in the rigorous elucidation and generalisation of the Precautionary Principle. The Novartis formulation falls foul both of the first principle of risk analysis - start by assessing the state of your own (or public) ignorance - and of the best formulation of the Precautionary Principle, which goes:

"Absence of evidence of harm is not evidence of absence of harm."

About here, we also reach for Arthur Kornberg's immortal saying:

"I have yet to see any problem, however complicated, which, looked at in the right way, does not become still more complicated."

[From his "For the Love of Enzymes", Harvard, 1989.]

Yet last week, answering a question about "precaution in the face of uncertainty", as opposed to "precaution in the face of evidence of possible harm", Sir John Krebs, of the Food Standards Agency, suggested that

"Precaution can only be administered with evidence."

Which can be right only when the search for "evidence of harm" continues unabated. (That the EU Risk Assessment programme has 100,000 chemicals, 2,000 of them "new", to examine, together with their interactions, indicates the scale of the exercise.)

So what do we do about it? The analogy between science and church may have an exploitable ethical spin-off, to which the British Association, in its "mission" to the young, might look. Professor Joseph Rotblat, winner of the Nobel Peace Prize, perhaps the last survivor of the Los Alamos team, and certainly among its most ethically conscious, in 1995 put forward the idea of a quasi-Hippocratic oath for research scientists. It is after all the man - usually a man - in the laboratory with the bright idea for a new weapon "who is at the heart of the arms race", to quote Lord Zuckerman. And, with an idea for cheaper feed, he was at the heart of the BSE affair.

During the Cold War, some 70% of scientists are thought to have worked for, or at the expense of, the world's entirely secretive Defence Industries: many still do. Joe Rotblat had it in mind to discourage young science graduates from going into these industries, to release us from increasingly irrelevant expenditures and hideous misuse of scientific talent.

Since the end of the Cold War, an unknown but huge number of scientists have gone into the equally secretive world of the multinational corporations. A whole new calculus of environmental and health risk - from mistaken judgments and irresponsible advocacy of new products and processes - has come to the fore, and those who favour the oath could well tack these risks onto the Hippocratic conception; and add the duty to whistle-blow. (If not you, who? If not now, when?)

The oath would have to be voluntary, but no doubt the best - the most intellectually alert and politically and socially conscious of new graduates - would be the ones who took it first, and this should in theory lead to a downward trend in the competitive appeal of firms who recruited non-jurors (to use a historical name for those who will not swear to something).

One difficulty is the lack of experience of anything so wide. The Hippocratic oath itself - "First: do no harm" etc - was and remains intended only for those who belong to a registered corps of people wholly devoted to the medical care of individuals. It might seem that to scatter about the invitation to sign the new oath among beginners in many branches of knowledge - from pistons to prions, from cyber to behavioural psychology, from cutting electrons to cloning electors - could...well, could what?

It seems well worth working out.

 


© Wayland Kennet, 2000.
 


 

