The UN’s climate summit in Paris at the end of 2015 concluded with a bang. The world’s governments promised sweeping cuts in carbon emissions. Rich countries promised to help poor ones with $100 billion per year in climate assistance. President Obama quickly declared the agreement “the best chance we have to save the one planet we’ve got.” The consensus quickly jelled that this was a major, historic achievement.

Then came the fizzle: The agreement is non-binding. Secretary of State John Kerry asserted on NBC’s Meet the Press that compliance would be enforced through the “powerful weapon” of public shaming, apparently implying a policy of verbal confrontation toward states that fall short. The Danish political scientist Bjørn Lomborg, a prominent critic of the top-down international conference approach to climate change, called the Paris agreement the “costliest in history” if implemented. According to Lomborg, the agreement would “reduce temperatures by 2100 by just 0.05 degrees Celsius (0.09 degrees Fahrenheit)…. This is simply cynical political theater, meant to convince us that our leaders are taking serious action…a phenomenally expensive but almost empty gesture.” Former NASA scientist Jim Hansen, one of the earliest proponents of the idea that global warming is manmade, slammed the deal as “half-assed and half-baked,” a “fake,” and a “fraud.”

Hansen’s assessment is probably close to the mark—and he and his fellow alarmists have only themselves to blame. While those who flatly deny the possibility of any global warming can be readily brushed aside, the alarmists have been much too quick to dismiss legitimate questions about precisely what the evidence shows.
Indeed, they have frequently treated such questions as heresies to be persecuted, adopting an even more virulently anti-scientific mindset than the one they accuse others of harboring.

Meanwhile, on the policy side, the alarmists’ call for worldwide economic controls, including caps on fossil fuels, is largely recycled from previous scientific doomsday fads, such as the oil scarcity scare of the late 1970s. Despite the enormous costs these policies would impose, especially on poor countries, they would do virtually nothing to stop anthropogenic climate change, let alone protect anyone from the relentless natural climate change that is one of our planet’s most prominent and inescapable features. They are also distracting attention both from investments that would make society less vulnerable to climate change, and from a more pressing crisis, namely the extinction of a large fraction of the world’s plant and animal species due to widespread modification of natural habitat.

Don’t be fooled by the fanfare in Paris: The climate change movement faces big trouble ahead. Its principal propositions contain two major fallacies that can only become more glaring with time. First, in stark contrast to popular belief and to the public statements of government officials and many scientists, the science on which the dire predictions of manmade climate change are based is nowhere near the level of understanding or certainty that popular discourse commonly ascribes to it. Second, and relatedly, the movement’s embrace of an absolute form of the precautionary principle distorts rational cost-benefit analysis, or throws it out the window altogether.

As the costs of decarbonization start to hit home, and the public demands greater certainty about the benefits to be gained, the public—and particularly those industries that are hardest hit—will invest in scientific research, in the hopes of achieving a more granular cost-benefit analysis.
Something similar is happening with proposed listings under the Endangered Species Act—where major economic interests are threatened, they have responded with enormous investments in scientific research in order to show either that the species in question is not in danger, or that it can be protected by measures far short of the often draconian prohibitions imposed pursuant to the Act.

These factors will almost certainly produce a more nuanced and less messianic view of the climate problem, with solutions aimed at maximizing “bang for the buck” at the margins, where climate threats are most grave, rather than reordering human society in order to “save” a planet that, in the grand scheme of things, is quite indifferent to the state of the climate at any given time.

All sides of the climate change debate have a huge incentive to generate more and better climate science: the alarmists and their more skeptical colleagues all want to prove their points. As our scientific understanding improves, many of the propositions we hear today will have to be modified, and many will be refuted, as has always happened in the history of science. The scientific community may at times be powerfully resistant to revision of its received wisdoms; it took an entire generation for medical professionals to accept the germ theory of disease, despite the fact that the evidence in its favor generated by Pasteur and Koch was clear from the start. But better science wins out in the end.

The greater clarity that better science will bring will open up new opportunities to solve environmental problems both known and unknown, and not a moment too soon. The human race faces challenges that cannot effectively be met at a local or even a national level. These challenges will not be met by a wholesale reordering of human society from the top down, as many of the more authoritarian-minded environmentalists wish.
Any attempt to impose command-economy solutions on a global scale will fall far short or outright fail, as the Paris agreement and its precursors show. The right strategy for confronting environmental challenges will have to be based on rational market incentives, rational cost-benefit analysis, and a broad-based consensus about the vital importance of efficient markets. Strategies that distort rational cost-benefit analysis (or the science on which it is based) to suit an anti-market agenda will not work and can only maintain the illusion of legitimacy for so long before they are discredited.

A Brief History of the Pleistocene Ice Age

It’s an amusing irony that fears of global warming have arisen during what is technically an ice age, namely the Pleistocene Ice Age, which began about 2.6 million years ago. A geological “ice age” typically lasts millions of years and is characterized by cycles of glaciation, during which glaciers grow and oceans recede, punctuated by warmer interglacial periods, in which glaciers recede and oceans rise—such as the current Holocene interglacial, in which human civilization has flourished.

During glacial periods, the northern hemisphere becomes substantially covered in glaciers, typically several kilometers thick. An abundance of data (from isotopes in ice sheets and the ocean floor, to the fossil record) enables us to reconstruct much of this history. During glaciations, average temperatures typically drop about 20 degrees Celsius below today’s, and sea levels drop about 400 feet below where they are now. During the coldest points in those glacial periods, an adventurous animal can walk from England across Europe and Asia to North America without getting its feet wet.
This almost certainly explains the Asian origins of Native American populations, whose ancestors are thought to have crossed the current Bering Strait on foot in repeated waves between 80,000 and 12,000 years ago.

The last glacial period began to end about 18,000 years ago, at the height of the Wisconsinan glaciation, when the first of several dramatic warming trends began. Average temperatures rose and fell and rose again by 20 degrees Celsius in barely 5,000 years, less than the time between Sumerian civilization and the present day. Sea levels, which lag temperature swings by long periods of time, rose 300 feet between 15,000 and 8,000 years ago. That’s an average of more than one meter per century. Among humans, sedentary agriculture first arose when temperatures stabilized near current levels about 12,000 years ago. There was at least one settled community that reached a population of 8,000 inhabitants in Turkey some 9,500 years ago. At the dawn of civilization, man would have experienced floods on a biblical scale.

In fact, major environmental changes have happened on time scales that are readily understandable in terms of human history. The Baltic Sea, for example, is typically a freshwater glacial-runoff lake that disappears completely during glacial periods. When the North Sea finally rose high enough to breach the land bridge between Denmark and Sweden, there were already large settled communities practicing agriculture. The saltwater ecology along the western edge of the Baltic Sea is not much older than the first pyramids.

Before the interglacial period began 18,000 years ago, most of North America was buried under a vast sheet of ice.
That glacial period (“ice age” in common parlance) lasted more than 100,000 years, though with significant variations in temperature, glacier cover, and sea levels—mini-cycles referred to as “stadials” and “interstadials.” During that time, anatomically modern humans spread throughout the world across land bridges that connected most of the continents. Our ancestors dominated the warmer climes and competed with Neanderthals for food across the tundra and ice of Europe.

The warm interglacial period before that glacial period lasted only from 145,000 to 127,000 years ago. At their maximum, temperatures were significantly warmer than today, with ocean levels about thirty feet higher. Evidence in the form of algae fossils suggests that during at least some part of this period, the Arctic Ocean was completely free of ice cover, during the summer months if not year round. Only Antarctica retained its vast ice sheets and glaciers.

This repeating cycle of 100,000-year glaciations and 10,000- to 20,000-year interglacials has been fairly consistent over the past 2.6 million years. The planet has trundled through the entire cycle dozens of times. If the pattern holds, we are due for another major glaciation sometime in the next several thousand years: The northern hemisphere will again become substantially covered in glaciers, ocean levels will fall hundreds of feet, and the earth’s overall production of plant biomass will fall substantially below what the current human population needs to feed itself. That will pose some ticklish technological challenges even for our hyper-adaptable species. Hopefully, such changes will be incremental enough to allow for adaptation.

It’s impossible to say when (or even whether) this next “ice age” will come, partly because the scientific theories of what drives these epochal glacial cycles are all underdetermined—that is, the theories explain part of the climate variation but not all of it.
For example, the start of interglacial periods seems correlated with variations in the earth’s orbit. But the extent of orbital “eccentricity” does not fully explain the amount of warming that occurs, implying that other factors and feedback amplifiers are also involved.

It is true, and at least somewhat alarming, that the current atmospheric carbon dioxide level of 400 parts per million (ppm) is far higher than at any time in the past 800,000 years, almost entirely as a result of humans burning fossil fuels. What we hear less often, however, is that during the first 1.8 million years of the Pleistocene Ice Age, carbon dioxide levels were significantly higher than that. Major glaciation occurred a dozen or more times, without taking much notice at all of what should have been a much stronger greenhouse effect. And for 245 million years before that, carbon dioxide levels were vastly higher. So carbon dioxide levels are the highest they’ve been in 800,000 years, but they’re also among the lowest they’ve been in 245 million years. Compared with that 245 million-year record, pre-industrial carbon dioxide concentrations of 280 ppm were perhaps perilously close to the level, around 150 ppm, below which plants cannot grow. It’s always possible to have too much of a good thing, but it bears recalling that carbon dioxide has vital benefits. Plant photosynthesis, which sustains virtually all life on earth, requires an abundance of sunlight, water, and carbon dioxide.

Heretical Questions

In political discourse, it is often necessary to simplify complex policy matters in order to make them accessible for public debate. But too much simplification can have the effect of stifling public discourse, as in this unfortunate State of the Union statement by President Obama: “The debate is over. Climate change is real.” Of course climate change is real. The climate is always changing.
Only the most foolish of the President’s critics believe otherwise, and it doesn’t help his cause to demonstrate that he can be just as foolish.

The evidence is overwhelming that the planet has been warming off and on for several centuries. There is also compelling evidence that at least some significant part of this warming is attributable to carbon dioxide from the burning of fossil fuels since the mid-20th century. There is good scientific reason to believe that increasing concentrations of greenhouse gases almost certainly constitute a net contribution to global warming. But crucial questions remain about the relative importance of natural factors that influence climate. One of these is the sequestration of carbon dioxide by biomass on land and in the oceans. Another concerns cloud cover, which reflects a large amount of solar radiation back into space, and which earlier models of climate change did not take into account (because it’s very hard to get right). The simple climate models of ten or twenty years ago are now showing their age amid a flood of new data, and the far more complex, uncertain, and varied picture those data illustrate. The President is therefore wrong in the sense that, for the most crucial scientific questions, the debate is just beginning.

The questions begin with the fact that while there is some correlation between temperature trends over the recent past and “anthropogenic” (or human-caused) carbon dioxide, the correlation is not very strong. The shape of the warming curve does not track the shape of the curve for increased carbon dioxide concentrations. For example, about 40 percent of the warming since 1900 happened in the first half of the 20th century, when “anthropogenic” carbon dioxide was insignificant. That warming could not have been caused by human behavior.
Then, from 1945 to 1975, just as major amounts of carbon dioxide from burning fossil fuels started to appear in the atmosphere, there was a major “hiatus” during which global average surface temperatures held steady or actually dropped slightly—again, no correlation. From 1975 to 2000 there appears to have been very rapid warming. But then, as anthropogenic carbon dioxide levels continued to increase, another hiatus in temperatures appears to have set in with the strong El Niño year of 1998. While there are major discrepancies among different data sets, and new data are still being collected, the IPCC’s latest report concedes that the rate of warming since 2000 has been substantially less than predicted by climate models in response to rising levels of carbon dioxide.

The public debate is dominated by simplistic claims that “climate change is man-made,” which might lead one to think that all of the current warming trend is man-made. But nearly all climate scientists accept that many factors influence temperatures, including major shifts in patterns of ocean circulation (such as the very strong El Niño largely responsible for the warm Christmas Day 2015 temperatures in North America), variations in the earth’s orbit, variations in solar activity, and volcanic activity. The “attribution statement” in the IPCC’s latest assessment report is carefully couched: “It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in GHG [greenhouse gases] and other anthropogenic forcings together.”

The distinction between “more than half” (the IPCC’s summary of scientific literature) and “all” or “nearly all” is crucial from the point of view of public policy.
If only about half the observed warming is due to human activity, the cost-benefit analysis of currently proposed policies becomes far more dubious, and reveals another problem: As much as half the current warming trend (whatever that is) could be due to natural causes, and current policies will do nothing to address that.

To see why, it’s crucial to focus on this precise scientific question: How much do temperatures actually increase when atmospheric carbon dioxide increases? Scientists express this relationship as a measure of “equilibrium climate sensitivity,” defined as how many degrees average global temperature will increase as a result of a doubling of atmospheric carbon dioxide. Virtually all the climate models used in the IPCC’s worst-case predictions of dangerous global warming presume an equilibrium climate sensitivity (ECS) of 3.0 to 3.5 degrees Celsius. But leading IPCC scientists have concluded that if humans were responsible for all observed warming since 1971, the ECS would be around 2.0 degrees Celsius. And if humans are only responsible for about half of the observed warming, as the IPCC itself admits is quite possible, that implies an ECS closer to 1.0 degrees Celsius.

A reliable figure for ECS continues to elude our grasp, but with an ECS of 1.0 degrees Celsius, the case for sweeping reductions in carbon emissions is greatly weakened. At that level of climate sensitivity, even if carbon dioxide emissions continue to increase unabated, temperatures would increase significantly less than the IPCC’s stated goal, which is warming of no more than two degrees Celsius by 2100. (This may explain why the Paris agreement moved the goalposts to a new goal of less than 1.5 degrees Celsius by 2100.) In that case, the IPCC’s worst-case scenario would be non-catastrophic by the IPCC’s own definition.

The policy implications are dramatic.
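The arithmetic behind these sensitivity figures is simple enough to sketch. The following is a rough back-of-the-envelope illustration, not the IPCC’s models: it assumes only the standard rule of thumb that equilibrium warming scales with ECS times the number of doublings of atmospheric carbon dioxide (a logarithmic relationship); the function name and the 280 ppm pre-industrial baseline are illustrative choices.

```python
import math

def warming_at_equilibrium(ecs_celsius, co2_ppm, co2_baseline_ppm=280.0):
    """Eventual (equilibrium) warming implied by a given ECS.

    Rule of thumb: each doubling of atmospheric CO2 eventually adds
    ECS degrees Celsius. Illustrative only; real attribution involves
    lags, aerosols, and natural forcings.
    """
    doublings = math.log(co2_ppm / co2_baseline_ppm, 2)
    return ecs_celsius * doublings

# One full doubling of pre-industrial CO2 (280 -> 560 ppm): warming
# equals the ECS exactly, so the assumed ECS drives everything.
for ecs in (1.0, 2.0, 3.0):
    print(f"ECS {ecs:.1f} C -> {warming_at_equilibrium(ecs, 560.0):.2f} C at 560 ppm")

# At today's roughly 400 ppm, an ECS of 1.0 implies only about 0.51 C
# of eventual warming over pre-industrial levels:
print(f"{warming_at_equilibrium(1.0, 400.0):.2f} C")
```

On this simplified accounting, cutting the assumed ECS from 3.0 to 1.0 degrees Celsius cuts every projected warming figure by two-thirds, which is why the attribution question (all of the warming versus half of it) dominates the cost-benefit calculus.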
Under this scenario, which lies well within the IPCC’s forecast, even dramatic reductions in carbon dioxide emissions would have no measurable impact on temperatures. And the natural factors responsible for as much as half the recently observed warming would presumably continue warming the planet, oblivious to any reduction in carbon emissions.

The key point is this: The IPCC’s latest “attribution statement” (extreme confidence that more than half the observed warming is due to humans) would be correct even if ECS is only 1.0 degrees Celsius and expected increases in carbon dioxide pose essentially no risk of catastrophic climate change.

Far from homing in on a reliable value for the crucial ECS metric, more recent data have only increased the uncertainty surrounding it. As climate scientist Judith Curry of the Georgia Institute of Technology points out, while the IPCC’s AR4 report (2007) expressed an ECS “best estimate” of 3.0 degrees Celsius, the AR5 report (2013) doesn’t express a “best estimate” for ECS at all. “The stated reason for not citing a best estimate in the AR5,” notes Curry, “is the substantial discrepancy between observation-based estimates of ECS (lower), versus estimates from climate models (higher).”

This highlights an important self-correcting feature in the development of climate science. Yes, it’s true that many major journals reject articles that critique the current consensus, and that funding priorities strongly reinforce the consensus. But even the strong bias in favor of more dire findings, which has been introduced into scientific inquiry by the pervasive politicization of the issue, cannot readily invent false data. Every year produces more raw data than the year before, and the discrepancies between the new data and the simple climate models are increasing. Alarmists say that discrepancies are to be expected, and models are meant to be refined.
But they have boxed themselves in with misleading claims to certain knowledge where in fact considerable uncertainty remains. Uncertainty about risks is not necessarily fatal to a policy of precaution, but false claims to certainty usually are, sooner or later. Witness the Iraq War and Saddam’s non-existent WMD.

A close look at more specific questions reveals even deeper uncertainties. Antarctic sea ice has increased steadily since the 1970s. The planet appears to have been warming, generally though inconsistently, since the “Little Ice Age” of the late 18th century and early 19th century, when George Washington repeatedly recorded snowfall at Mount Vernon in early November. And even the Little Ice Age appears to have been just a hiatus in a broader warming trend that has lasted about 400 years. The early climate models of a few decades ago predicted that we had already, or would soon, exceed the maximum temperatures of both the Holocene Maximum 5,000 years ago and the Medieval Warming Period a thousand years ago, when Vikings wrote of grapes growing in Vinland (coastal North America and Newfoundland)—the two hottest periods of the current interglacial. Those predictions have not been borne out by more recent data: Average temperatures are still far short of both of those (relatively recent) peaks.

The flood of new data has forced the United Nations to revise downward its prognostications of ocean-level increases. The worst-case sea level rise by 2100 was revised downward from 3.7 meters in the first IPCC report to 1.2 meters in the second, to 0.8 meters in the third, and to 0.6 meters in the fourth. The most recent IPCC assessment report, AR5, expressed “medium confidence” that there is at least a 66 percent probability of sea levels rising by 0.45 to 0.82 meters in a high-carbon-emissions scenario.

The prognostications of extreme weather, which alarmists matter-of-factly blame on anthropogenic climate change, have been increasingly discredited.
Neither tornadoes nor hurricanes have been more frequent or intense since 1950 than they were in the half-century before—though we can’t know for sure, because precise data on tornadoes and cyclones are too recent. Likewise, predictions of infectious diseases such as malaria spreading more widely have also been largely discounted in more recent IPCC reports, because vulnerability to infectious diseases correlates less with warm climates than with poverty and the poor public health conditions it entails, both of which risk being greatly exacerbated by strong decarbonization policies.

This recurring failure to explain the past and predict the present hasn’t stopped alarmists from claiming, as the President did recently on Twitter, “Ninety-seven percent of scientists agree: climate change is real, man-made and dangerous.” The President was perhaps relying on the most famous of the surveys claiming a near-universal consensus on man-made climate change—a 2013 survey by the Australian John Cook and several colleagues in the journal Environmental Research Letters. Cook and his colleagues surveyed the abstracts of 11,944 peer-reviewed journal articles drawn from a simple database search of key terms. They classified the papers on the basis of whether they expressed an opinion on whether “humans are causing global warming” and, if so, whether that opinion embraced or rejected the consensus. They found that about two-thirds expressed no opinion, and of the third that did, nearly all embraced the consensus.
Published on: March 31, 2016
Green Idols
Twilight of the Climate Change Movement
Mario Loyola
Don’t be fooled by the post-Paris fanfare: The climate change movement faces big trouble ahead.
Mario Loyola is a senior fellow at the Wisconsin Institute for Law and Liberty.