Saturday, September 07, 2013

But What Do We Mean By Consensus?

Guest essay by Christopher Monckton of Brenchley
 
Politicians pay for science, but scientists should not be politicians. Consensus is a political concept. Unwisely deployed, it can be damagingly anti-scientific. A reply to Naomi Oreskes (Nature, 4 September 2013).

Subject terms: Philosophy of science, consensus, climate change

The mathematician, astronomer and philosopher of science Abu Ali Ibn al-Haytham, or Alhazen, is justly celebrated as the founder of the scientific method. His image appears on Iraqi banknotes and on the postage stamps of half a dozen nations of the ummah wahida.

Al-Haytham, unlike Naomi Oreskes,[1] did not consider that consensus had any role in science. He wrote that “the seeker after truth” does not put his trust in any mere consensus, however venerable: instead, he submits what he has learned from it to reason and demonstration. Science is not a fashion statement, a political party or a belief system.

The objective of science, as of religion, is truth. Religion attains to the truth by accepting the Words of Messiahs or of Prophets and pondering these things in its heart[2]. Science attains to the truth by accepting no word as revealed and no hypothesis as tenable until it has been subjected to falsification by observation, measurement and the application of previously established theory to the results.

The Royal Society’s dog-Latin motto, Nullius in verba, roughly translates as “We take no one’s word for it”. The Society says, “It is an expression of the determination of Fellows to withstand the domination of authority and to verify all statements by an appeal to facts determined by experiment.”[3] No room for consensus there.


T.H. Huxley, FRS, who defeated Bishop Wilberforce in the debate over evolution at the Oxford Museum of Natural History in 1860, put it this way: “The improver of natural knowledge absolutely refuses to acknowledge authority, as such. For him, scepticism is the highest of duties: blind faith the one unpardonable sin.”[4] Richard Feynman agreed: “Science,” he said, “is the belief in the ignorance of experts.”[5]

Karl Popper[6] formalized the scientific method as an iterative algorithm starting with a general problem. To address it, a scientist proposes a falsifiable hypothesis. During the error-elimination phase that follows, others demonstrate it, disprove it or, more often, do neither, whereupon it gains some credibility not because a consensus of experts endorses it but because it has survived falsification. Head-counts, however expert the heads, play no part in science.

The post-modernist notion that science proceeds by the barnacle-like accretion of expert consensus on the hulk of a hypothesis is a conflation of two of the dozen sophistical fallacies excoriated by Aristotle[7] 2350 years ago as the commonest in human discourse. The medieval schoolmen later labelled them the fallacies of argumentum ad populum (consensus) and argumentum ad verecundiam (appeal to reputation).

Science has become a monopsony. Only one paying customer – the State – calls the tune, and expects its suppliers to sing from the same hymn-sheet. Governments, by definition and temperament interventionist, are disinclined to pay for inconvenient truths. They want results justifying further intervention, so they buy consensus.

The Hamelin problem is compounded by a little-regarded consequence of nearly all academics’ dependency upon the public treasury. Those whom the State feeds and houses will tend to support the interventionist faction, and may thus give a spurious legitimacy to a political consensus by parading it as scientific when it is not.

Too often what is really a political consensus is defined with careful vagueness, allowing its adherents to pretend that widespread scientific endorsement of an uncontentious version implies support for a stronger but unsupported version.

Consider climate change. The uncontentious version of the climate consensus is that greenhouse gases cause warming. Oft-replicated experiment establishes that interaction with near-infrared radiation induces a quantum resonance in a greenhouse-gas molecule, such as carbon dioxide, so that it emits heat directly, as though a tiny radiator had been turned on. Thus, adding greenhouse gases to the air will cause some warming. Where – as here – the experimental result is undisputed because it is indisputable, there is no need to plead consensus.

The standard version of the climate consensus, however, is stronger. It is that at least half the global warming since 1950 was anthropogenic.[8],[9] Supporters of the uncontentious version need not necessarily support this stronger version.

Though IPCC (2013) has arbitrarily elevated its level of confidence in the stronger version of consensus from 90% to 95%, Cook et al. (2013),[10] analyzing the abstracts of 11,944 papers on global climate change published between 1991 and 2012, marked only 64 abstracts as having explicitly endorsed it. Further examination[11] shows just 43 abstracts, or 0.3% of the sample, endorsing it.

No survey has tested endorsement of the still stronger catastrophist version: that unless most CO2 emissions stop by 2050 there is a 10% probability[12],[13] that the world will end by 2100. The number of scientists endorsing this version of the consensus may well be vanishingly close to zero.

The two key questions in the climate debate are how much warming we shall cause and whether mitigating it today would cost less than adapting to its net-adverse consequences the day after tomorrow. There is no consensus answer to the first. The consensus answer to the second may surprise.

Answering the “how-much-warming” question is difficult. Models overemphasize radiative transports, undervalue non-radiative transports such as evaporation and tropical afternoon convection, and largely neglect the powerfully homoeostatic effect of the great heat-sinks – ocean and space – that bound the atmosphere.

Absolute global temperatures have varied by only ±1% in 420,000 years[14]. Will thermometers be able to detect the consequences of our altering 1/3000 of the atmospheric mix by 2100?
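
Both figures are easy to sanity-check. In the two-line sketch below, the 288 K mean surface temperature and the ~330 μatm CO2 increment are assumptions supplied for illustration, not figures from the essay:

```python
# Sanity check of the two figures above. The 288 K mean surface temperature
# and the ~330 ppm CO2 increment are illustrative assumptions.
mean_t_k = 288.0                 # approximate absolute global mean temperature
print(f"+/-1% of {mean_t_k} K is +/-{0.01 * mean_t_k:.1f} K")   # ~ +/-2.9 K

co2_increment = 330e-6           # assumed anthropogenic CO2 rise by 2100
print(f"CO2 increment as a share of the air: 1/{1 / co2_increment:.0f}")  # ~1/3000
```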

Uncontroversially, direct radiative warming at CO2 doubling will be the product of the instantaneous or Planck parameter[15], 0.31 K W⁻¹ m², and the CO2 radiative forcing[16] at doubling, 5.35 ln 2 ≈ 3.7 W m⁻²: i.e., ~1.2 K. Models near-triple this value by temperature-feedback amplification. Yet no feedback can be measured directly or determined theoretically. Feedbacks may even be net-negative.[17],[18]
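
The arithmetic can be verified in a few lines:

```python
import math

# Reproducing the paragraph's arithmetic: direct (no-feedback) warming at CO2
# doubling = Planck parameter x radiative forcing at doubling.
planck = 0.31                  # K W^-1 m^2, instantaneous (Planck) parameter
forcing = 5.35 * math.log(2)   # W m^-2, CO2 forcing at doubling
print(f"Forcing at doubling: {forcing:.2f} W m^-2")        # ~3.71 W m^-2
print(f"Direct warming:      {planck * forcing:.2f} K")    # ~1.15 K, i.e. ~1.2 K
```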

Another uncertainty is introduced by the amplification equation in the models, which was designed for electronic circuits, where it has a physical meaning; in the climate it has none as the loop gain approaches the singularity at 1. In a circuit, feedbacks driving the voltage to the positive rail flick it to the negative rail as the loop gain exceeds 1. In the climate there is no such physical mechanism.
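
A minimal sketch of the circuit-style equation at issue, assuming the standard zero-dimensional form (equilibrium warming = direct warming / (1 − g), with loop gain g, which the essay does not itself write out): the output grows without limit as g approaches the singularity at 1, and a loop gain near 0.65 reproduces the near-tripling mentioned above.

```python
# Circuit-style feedback amplification: output blows up as loop gain g -> 1,
# a singularity with physical meaning in a circuit but none in the climate.
direct_warming = 1.2  # K, the no-feedback value derived above

for g in (0.0, 0.3, 0.5, 0.65, 0.9, 0.99):
    amplified = direct_warming / (1.0 - g)
    print(f"loop gain g = {g:4.2f} -> equilibrium warming = {amplified:6.1f} K")
```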

The chaoticity of the climate object is an additional, insuperable uncertainty.[19],[20] The IPCC admits this: “In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system and, therefore, that the long-term prediction of future climate states is not possible.”[21]

The atmosphere, like any object that behaves chaotically, is highly sensitive to initial conditions. The available data will always be inadequate to allow reliable prediction – especially by probability distribution in model ensembles – of the chaos-driven bifurcations that make the climate what it is.
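
The point about sensitivity to initial conditions can be illustrated with any textbook chaotic system. The logistic map below (an illustration only, not climate physics) shows two states differing by one part in a billion diverging completely within a few dozen steps:

```python
# Logistic map: a textbook chaotic system. Two initial states differing by
# one part in a billion diverge completely within a few dozen iterations.
r = 4.0                              # parameter value in the chaotic regime
x, y = 0.400000000, 0.400000001      # initial conditions differing by 1e-9

for step in range(1, 51):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.6f}")
```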

Given these real uncertainties, the IPCC’s claim of 95% confidence as to the relative contributions of Man and Nature to the 0.7 Cº of global warming since 1950 is surely hubris. Nemesis is already at hand. Empirically, the models are not doing well. The first IPCC Assessment Report predicted global warming at 0.2-0.5 Cº/decade by now. Yet the observed trend on the HadCRUT4 data[22] since 1990, at little more than 0.1 Cº/decade, is below the IPCC’s least estimate.

Taking the mean of all five global-temperature datasets, there has been no global warming for almost 13 years, even though CO2 concentration increases should have caused at least 0.2 Cº warming since December 2000.

Given the Earth’s failure to warm as predicted, and the absence of support for the IPCC’s version of the climate consensus, its 95% confidence in the anthropogenic fraction of the 0.7 Cº warming since 1950 seems aprioristic.
 
[Figure: No global warming for 12 years 8 months. Data sources: GISS, HadCRUT4, NCDC, RSS and UAH.]

So to the economic question. Posit ad argumentum that the IPCC’s central estimate of 2.8 Cº of warming from 2000 to 2100 is true, and that Stern[23] was right to say the cost of failing to prevent 2-3 Cº of warming this century is ~1.5% of GDP. Then, even at a zero inter-temporal discount rate, the cost of abating this decade’s predicted warming of 0.17 Cº[24] by CO2-mitigation schemes whose unit mitigation cost is equivalent to that of, say, Australia’s carbon tax will be 50 times the cost of later adaptation.

How so? Australia emits just 1.2%[25],[26] of global anthropogenic CO2. No more than 5% of Australia’s emissions can now be cut this decade, so no more than 0.06% of global emissions will be abated by 2020. Then CO2 concentration will fall from the now-predicted 410 μatm[27] to 409.988 μatm. In turn, predicted temperature will fall, but only by 0.00005 Cº, or 1/1000 of the minimum detectable global temperature change. This is mainstream, consensus IPCC climatology.
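
The chain of figures can be checked in a few lines. The one number supplied below rather than by the essay is this decade’s CO2 concentration growth, assumed to be ~20 μatm, which is the value needed to reconcile 410 with 409.988:

```python
# Reproducing the abatement arithmetic above.
aus_share = 0.012               # Australia's share of global CO2 emissions
cut = 0.05                      # fraction of Australian emissions cut by 2020
global_cut = aus_share * cut
print(f"Share of global emissions abated: {global_cut:.2%}")        # 0.06%

decade_growth_ppm = 20.0        # assumed CO2 concentration growth, 2010-2020
conc_2020 = 410.0 - global_cut * decade_growth_ppm
print(f"CO2 concentration in 2020: {conc_2020:.3f} ppm, not 410")   # 409.988
```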

The cost of this minuscule abatement over ten years will be $162 billion[28], equivalent to $3.2 quadrillion/Cº. Abating just the worldwide mean warming of 0.17 Cº predicted for this decade would cost $540 trillion, or $77,000/head worldwide, or 80% of ten years’ global GDP[29]. No surprise, then, that in the economic literature the near-unanimous consensus is that mitigation will cost more than adaptation[30],[31]. The premium vastly exceeds the cost of the risk insured. The cost of immediate mitigation typically exceeds by 1-2 orders of magnitude that of eventual adaptation.[32]
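
The remaining figures follow from the same unit cost, and the ratio at the end confirms the “50 times” claim of the earlier paragraph. In this sketch the world population and ten years’ global GDP are assumptions supplied for illustration; the small differences from the quoted $540 trillion, $77,000 and 80% are rounding:

```python
# Reproducing the cost arithmetic above. World population and global GDP are
# illustrative assumptions, not figures from the essay.
cost_aus = 162e9         # $: ten-year cost of the Australian carbon tax [28]
warming_cut = 0.00005    # Celsius degrees abated by that scheme
per_degree = cost_aus / warming_cut
print(f"Unit mitigation cost: ${per_degree:.2e} per C-deg")   # ~$3.2 quadrillion

decade_warming = 0.17    # C-deg of warming predicted for this decade
world_cost = per_degree * decade_warming
print(f"Cost of abating 0.17 C-deg: ${world_cost / 1e12:.0f} trillion")

population = 7.1e9       # assumed 2013 world population
print(f"Per head worldwide: ${world_cost / population:,.0f}")

gdp_ten_years = 700e12   # assumed: ~$70 trillion/yr of global GDP over ten years
share = world_cost / gdp_ten_years
print(f"Share of ten years' global GDP: {share:.0%}")

stern_cost = 0.015       # Stern's ~1.5% of GDP for later adaptation
print(f"Mitigation premium over adaptation: {share / stern_cost:.0f}x")  # ~50x
```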

Accordingly, Oreskes’ statement that “Political leaders who deny the human role in climate change should be compared with the hierarchy of the Catholic church, who dismissed Galileo’s arguments for heliocentrism for fear of their social implications” is not only scientifically inappropriate but historically inapt: for no political leaders “deny the human role in climate change”, though some may legitimately doubt its magnitude or significance; and none impose any such opinion upon their citizens.

It is the true believers in the New Religion of Thermageddon who have demanded that their opponents be put on trial for “treason” (Robert Kennedy) and for “high crimes against humanity” (James Hansen, NASA)[33]. The penalties for treason and for crimes against humanity are not the house arrest to which Galilei was sentenced, but death. Insistence upon consensus has often bred the most brutal kind of intolerance.

The true lesson of l’affaire Galilei, then, is that the governing class, then the high priests of Rome, now the acquiescent archdruids of academe and their paymaster the State, should not intolerantly abuse their power, then of theology, now of monopsony reinforcing peer-pressure rebranded as consensus, by interfering in scientists’ freedom to be what al-Haytham had beautifully called them: seekers after truth.

Reposted from Watts Up With That. You can read the whole thing - including comments, footnotes, and illustrations - over there.