Here’s something you probably hadn’t considered: the computing needed to model future climate change is extraordinarily energy intensive, so much so that scientists are holding off on building better machines for fear of the carbon footprint. The New York Times reports:
[Climate modeling machines] will need to be more than 100 times faster than today’s most powerful supercomputers, and ironically, such an effort to better understand the threat of climate change could actually contribute to global warming. If such a computer were built using today’s technologies, a so-called exascale computer would consume electricity equivalent to 200,000 homes and might cost $20 million or more annually to operate.
For that reason, scientists planning the construction of these ultrafast machines have been stalled while they wait for yet-to-emerge low-power computing techniques capable of significantly reducing the power requirements for an exascale computer.
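Those numbers pass a rough back-of-envelope check. Assuming the roughly 20-megawatt power target often cited for exascale systems and a flat $0.10 per kilowatt-hour electricity rate (neither figure comes from the Times excerpt), the annual power bill lands in the same ballpark:

```python
# Rough sanity check on the Times' "$20 million or more annually" figure.
# Assumptions (not from the article): a ~20 MW draw, the power target
# often cited for exascale machines, and a flat $0.10/kWh rate.
power_kw = 20_000          # 20 MW expressed in kilowatts
hours_per_year = 24 * 365  # 8,760 hours
price_per_kwh = 0.10       # dollars

annual_cost = power_kw * hours_per_year * price_per_kwh
print(f"~${annual_cost / 1e6:.1f} million per year")  # ~$17.5 million
```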
Will greens point to concern over supercomputers’ energy consumption to explain the persistent failures of climate models? It’s certainly a handy excuse. But beyond the easy jabs news like this provides, there’s a deeper problem: the sheer size and complexity of the system we’re studying.
The fallibility of even our best climate models is easy to understand. Our planet’s climate is one of the most complicated systems we’ve ever endeavored to comprehend, full of innumerable variables and a web of relationships whose surface we’ve barely scratched. It’s a dizzyingly difficult task, and for obvious reasons an important one, but we’re far from the realm of “settled science,” no matter what the environmental movement might have you believe.
We know that humanity is emitting greenhouse gases at alarming rates, and that these gases absorb heat that would otherwise radiate back into space, pushing surface temperatures up. But beyond that we’re bewildered, and apparently we can’t yet employ our best technology to dissect the problem for fear of contributing to that very warming.
The NYT story highlights Rice University computer scientist Krishna Palem, who suggests embracing “inexact” computing, arguing that the most advanced (and therefore most energy-hungry) machines aren’t necessary for climate science. In his words, “[s]cientific calculations like weather and climate modeling are generally, inherently inexact…We’ve shown that using inexact computation techniques need not degrade the quality of the weather-climate simulation.”
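To make the idea concrete, here’s a minimal sketch of what trading precision for efficiency can look like. Palem’s actual work involves inexact hardware, which this toy only mimics in software: it runs a simple diffusion loop (a hypothetical stand-in for a climate kernel, not his model) at full and at drastically reduced floating-point precision and compares the results:

```python
import numpy as np

def diffuse(temps, steps, dtype):
    """Toy 1-D heat-diffusion loop at a chosen floating-point precision."""
    t = temps.astype(dtype)
    alpha = dtype(0.1)  # diffusion coefficient (kept below 0.5 for stability)
    for _ in range(steps):
        # explicit finite-difference update of the interior points
        t[1:-1] = t[1:-1] + alpha * (t[2:] - 2 * t[1:-1] + t[:-2])
    return t

rng = np.random.default_rng(0)
initial = rng.normal(15.0, 5.0, size=1000)  # synthetic temperature field

exact = diffuse(initial, 500, np.float64)    # full double precision
inexact = diffuse(initial, 500, np.float16)  # far fewer bits per operation

# how much does the coarse-precision run drift from the reference?
print("max abs difference:", np.abs(exact - inexact.astype(np.float64)).max())
```

Each half-precision operation moves and stores far fewer bits, which is where the energy savings would come from on hardware built to exploit it; the printed drift gives a feel for how much (or how little) accuracy a smooth problem like this gives up in exchange.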
That’s hard to stomach, given how far off our current generation of inexact models has been in predicting surface temperatures in recent years. Pragmatically speaking, it may reflect an unfortunate truth: we have to work with the computers we have until faster and more efficient machines are developed. But let’s not pretend climate science is farther along than it actually is, or that there isn’t a real need to devote top-of-the-line resources to its study.