When we need to study something in science that is too large or complex to easily work with, we’ll often make a model of it in order to recreate it in a simpler way. You might not realize it, but we actually use models all the time in our everyday lives. They can be static and representational, like the New York City subway map – or they can be used to make predictions of dynamic systems, like a weather forecast or the ups-and-downs of the stock market.
Even our understanding of climate change, a phenomenon on the tip of everyone's tongue, is based on predictions made by climate models. Global Climate Models, otherwise known as GCMs, are enormous computer programs that represent the Earth's climate system mathematically, built from the laws of physics. Let's take a look at how they work. Energy enters the atmosphere as solar radiation, then transfers and circulates within the Earth's system. To track how that energy flows through the system, GCMs simulate the Earth's climate conditions.
Initial climate state
They account for processes such as solar input, heating and reflection from land and water, the movement of air masses, and the ice-albedo response. GCMs divide the Earth into thousands of gridded 3-D cells. Each grid box holds a single value for each atmospheric variable being predicted, such as wind, temperature, pressure, humidity, cloud cover and cloud ice. Over the past thirty years, advances in computational power have allowed the grid boxes to shrink and the mathematical equations to be solved more accurately. Just as the Earth is divided into discrete cells, time is divided into finite intervals, or time-steps. If we wanted to compute the global climate 10 years from now quickly, we might be tempted to do it in one go, with a time-step of 10 years. Such a large time-step is impossible, though, because the energy we are accounting for within a cell would already have moved on into adjacent cells.
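The gridded structure described above can be sketched as one value per cell for each atmospheric variable. This is only a toy illustration of the idea, not a real GCM data layout: the grid sizes, variable names and starting values below are all invented, and real models use far finer grids.

```python
# Toy 3-D grid: 36 longitude bands x 18 latitude bands x 5 vertical levels.
# These sizes are purely illustrative; real GCM grids are much finer.
n_lon, n_lat, n_lev = 36, 18, 5

def make_field(value):
    """One value per grid cell, stored as nested lists [lon][lat][level]."""
    return [[[value for _ in range(n_lev)]
             for _ in range(n_lat)]
            for _ in range(n_lon)]

# Each predicted atmospheric variable gets its own field of per-cell values.
grid = {
    "temperature_K": make_field(288.0),   # roughly 15 degrees C everywhere
    "pressure_hPa":  make_field(1013.25),
    "humidity":      make_field(0.0),
    "cloud_cover":   make_field(0.0),
}

# Halving the cell size in every dimension multiplies the cell count by 8,
# which is why finer grids demand so much more computational power.
n_cells = n_lon * n_lat * n_lev
print(n_cells)  # 3240
```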
Movement of air, for example, may need a time-step as small as 30 minutes or less. So instead, GCMs take an initial climate state and calculate the climate state one time-step later, then repeat the calculation starting from the resulting new state. This is repeated until the total number of time-steps reaches the desired target time of 10 years. It might seem like so much processing would leave lots of room for error, but think of it this way: when we flip a coin, it is hard to predict whether the result will be heads or tails. Over many flips, however, we can reliably say that the probability of heads is 50%. GCMs run on time-scales that are decades long, which means hundreds of thousands of time-steps, or coin tosses. Because GCMs simulate the Earth's climate over such large time-scales, many consider them more robust than shorter time-scale models. To test GCMs, scientists have gone back and used the models to recreate climate history. This way, they can compare the models' predictions with the empirical evidence actually recorded in the past.
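The repeated stepping just described can be sketched as a loop: take an initial state, advance it one small time-step, and feed the result back in until the target time is reached. The simple "energy spreads to adjacent cells" update below is an invented stand-in for the real model physics, and the step size, rate and cell values are illustrative only.

```python
def step(state, dt, rate=0.1):
    """Advance one time-step: each cell exchanges energy with its
    neighbours. A crude stand-in for the equations a real GCM solves."""
    n = len(state)
    new = state[:]
    for i in range(n):
        left = state[i - 1] if i > 0 else state[i]
        right = state[i + 1] if i < n - 1 else state[i]
        # Move a fraction of the difference toward each neighbour.
        new[i] = state[i] + rate * dt * (left + right - 2 * state[i])
    return new

# Initial climate state: one warm cell in a row of cold ones.
state = [0.0, 0.0, 10.0, 0.0, 0.0]

# A 30-minute step (dt = 0.5 hours), repeated until we reach the target.
dt, target_hours = 0.5, 24.0
for _ in range(int(target_hours / dt)):
    state = step(state, dt)

# Energy is redistributed between cells, not created or destroyed,
# so the total stays constant no matter how many steps we take.
print(round(sum(state), 6))  # 10.0
```

A single 24-hour step would overshoot wildly here, which mirrors why the transcript says large time-steps are impossible: energy would leave a cell faster than one big update can account for.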
Factors affecting climate
They started in 1900 and attempted to predict the climate changes over the following 100+ years, a century in which an unprecedented spike in temperature occurred. They ran the models twice: the first time, they included only natural forcings. The second time, they also included human factors in the equations, accounting for the increased addition of greenhouse gases to the atmosphere. Compared against the observed global and regional climate data from those 100+ years, the first round of predictions didn't match up. But when the models were run the second time, accounting for both natural and human factors affecting climate, the results were remarkable. Not only do the predictions lie within a 95% confidence interval of the actual data, they are correct for the continents and the oceans individually. And it wasn't only one GCM that found this to be true: all 15 independently run GCMs agreed. This allows scientists to attribute the observed climate changes to human factors such as increased industrialization, deforestation and urbanization, which raise the concentration of greenhouse gases. If you're still skeptical, let's take a look at another example.
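The logic of that two-run comparison can be sketched with a toy hindcast. Every number below is invented for illustration, not real climate data: a "natural-only" run with no sustained trend misses the made-up observed warming, while a run that adds a growing greenhouse-gas term tracks it much more closely.

```python
# Toy hindcast over a century; all values are illustrative, not real data.
years = list(range(1900, 2001, 10))

# Invented "observed" temperature anomaly: flat early, rising late-century.
observed = [0.0, 0.0, 0.05, 0.1, 0.1, 0.15, 0.2, 0.35, 0.5, 0.65, 0.8]

def natural_only(year):
    # Natural forcings alone: no sustained trend in this toy model.
    return 0.05

def natural_plus_human(year):
    # Add a greenhouse-gas term that grows over the century (invented rate).
    return 0.05 + max(0, year - 1950) * 0.015

def mean_abs_error(model):
    """Average gap between a model run and the observed record."""
    return sum(abs(model(y) - o) for y, o in zip(years, observed)) / len(years)

# The run that includes human factors fits the record far better.
print(mean_abs_error(natural_only) > mean_abs_error(natural_plus_human))  # True
```

Real attribution studies compare full GCM ensembles against measured data with formal confidence intervals, but the core idea is the same: whichever set of forcings reproduces the record earns the credit for explaining it.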
When the first sophisticated GCMs were developed and run in the 1990s, the predictions they generated didn't seem to make any sense. They indicated that there would be rapid heating of the upper layers of the troposphere and a corresponding cooling of the lower levels of the stratosphere. But when the recorded data from 1960 onward was examined, it turned out that the GCMs' predictions were verified. The cause was, once again, the effect of greenhouse gases. These gases, concentrated in the troposphere, were trapping low-frequency radiation, causing the unexpected warming of the troposphere and the corresponding cooling of the stratosphere. So even though not all models make correct predictions every single time, GCMs have proven over and over again to be reliable, sophisticated systems. And if the past is any indication of the future, we would be wise to heed their predictions.