Is climate research pseudo science?
Swedish original text from June, 2003, English translation from Nov, 2003, a few updates made since.
The answer is: Yes, to a large extent. There is, however, a dividing line precisely at the present time. Research about the past climate of the world is mostly science. Research about the future climate is almost entirely pseudo science.
My answer to the question in the headline may come as a surprise to most readers. Probably you have been given another impression by the mass media. However, I am able to support my claim with countless examples, of which I will give a selection below. The results from this pseudo science are also brought to the public's attention through a UN organisation, lending them much credibility, at least in the eyes of Europeans, who usually consider the UN a good organisation.
Climate research has been considered so important that two UN organisations, the World Meteorological Organisation, WMO, and the UN Environment Programme, UNEP, joined and formed a third: the Intergovernmental Panel on Climate Change, IPCC, in 1988. The official task of the IPCC has been to study whether man changes the climate, and if he does, by how much. From the very start, however, the IPCC has been in the hands of what could be called climate catastrophists, so this organisation has never doubted that the human influence on climate is large and dangerous. A climate catastrophist is a person who forecasts climate catastrophes in the (near) future.
You may wonder what I mean by pseudo science. It is simply a phenomenon that tries to appear as science, and may look like science to the layman. On closer inspection, however, you notice that it is not science at all. It lacks several of the criteria that science satisfies. The scientific outlook is only superficial, usually a dress that the followers of a belief have put on it to make it more appealing.
Sven-Ove Hansson is a Swedish professor of philosophy at the Royal Institute of Technology in Stockholm. He has written a lot about pseudo science and has developed seven criteria for distinguishing pseudo science from real science. He has published them in Swedish, for example in a book with the title Vetenskap och Ovetenskap (meaning Science and Non-Science). As far as I know, he has not published anything in English, which is unfortunate, since his criteria are good for discovering theories and hypotheses that are not really scientific.
I will only briefly mention the three of Hansson's criteria that I do not think climate science meets. The first criterion is belief in some kind of authority who is considered to possess knowledge unavailable to ordinary people. The second criterion is experiments that cannot be repeated; at best, they can only be made by those who believe in a theory, not by other researchers. The seventh criterion is that explanations are abandoned without anything replacing them. The third, fourth, fifth and sixth of Hansson's criteria, however, are in my opinion obviously met by the climate catastrophists' theories, and thus those theories should be called pseudo science.
The third of Hansson's criteria is that carefully selected samples are used when a random sample would be possible. One example of this in climate research is in fact the foundation of the view that the world is overheating. There are many ways of calculating the average temperature of the world, and the catastrophists have chosen the one that gives the largest warming in the past couple of decades: calculating mean temperatures from the network of weather stations spread across the globe. To save space, I will not give all the arguments for why this method to a large extent measures urban warming rather than global warming; Daly (2000) deals with this in detail, for those interested. Those weather stations are to a large extent located where people live. Thus, they are far from a random sample of the surface of the world. Since 1979, temperatures have also been measured from space, with microwave sounding units (MSU) on satellites. Not surprisingly, these measurements give a much smaller warming.
For a long time, it was considered a scientific truth that the world was warmer than today during a period of about 400 years, ca 800-1200 AD. At its warmest, during this Mediaeval Warm Period (MWP), the world may have been 1-2 °C warmer than at present. Thereafter followed a Little Ice Age (LIA), which essentially lasted until the mid-19th century but was most severe in the 17th and 18th centuries. This cold maximum coincided with the Maunder Minimum, a period when the Sun is known to have had almost no sunspots. Therefore, it is believed that the Sun had something to do with the unusual coldness. This picture of the past climate was based on several indicators, such as tree rings, and also on historical documents. Most of the material was from Europe, but it seemed to be supported also by data from other parts of the world.
That the climate could vary so much by natural means was a problem for the climate catastrophists, who wanted to claim that (almost) all of the climate change of the 20th century was due to man. If natural variations could do this before 1850, maybe they were responsible also for changes after 1850? A few years ago, to the relief of the catastrophists, a new study questioning the old view (Mann et al. 1999) was published. According to this new study, the world was somewhat colder 1000 years ago than it is today. Then followed 850 years of almost linear and slow cooling. In the middle of the 19th century, the cooling abruptly switched to a warming, which is still continuing. One important conclusion of this study was that the 1990's were probably the warmest decade in at least 1000 years.
Common sense in science tells you to be a bit skeptical about any investigation that throws old truths away and gives a completely new picture. Preferably, scientists wait for more research, which either supports or disproves the new view, before they decide what to think about it. Not so in this case. The IPCC immediately adopted Mann's reconstruction as the truth about the past climate of the world. The diagram from the study appeared in reports, journals and conferences. Politicians and laymen were to be left in no doubt that the evolution of the world's climate over the past 1000 years had been decided once and for all. But there is more. Mann's study covered the Northern Hemisphere only, yet quite often the diagram was presented as showing the climate history of the entire world. Often the quite large error bars in the diagram had "disappeared", so that the uncertainty of the past climate history was hidden.
A closer inspection of the study in question shows some peculiarities. The temperature indicators used in fact end in 1980, when the temperature is a little colder than in 1940. To give the impression of a rapid warming at the end of the 20th century, the calculations from the weather stations have been added to the diagram. When I looked at the table of the data used in the study, I found a few small errors. Although they should not have affected the result, this certainly does not give a good impression. The most important objection to the study is, however, that support for rewriting the climate history of the Earth in this way is missing. Later studies have not supported Mann's study; on the contrary. Briffa et al. (2001), for example, have made their own study, as well as a compilation of results from other studies. It shows that Mann's study has smaller temperature variations than any other study, as well as a less severe cold period in the 17th and 18th centuries.
The most important factor regulating climate, next to the Sun, is the oceans. They store heat corresponding to several years of insolation from the Sun. They interact with the atmosphere in complex processes which are not fully understood. Three complex patterns of interaction seem more important than the others: the North Atlantic Oscillation (NAO), the Pacific Decadal Oscillation (PDO), and the El Niño-Southern Oscillation (ENSO) in the southern Pacific. The latter has two disturbances that occur semi-regularly at intervals of 2-7 years. They are rather famous and are called El Niño and La Niña. The first means that the normal circulation almost or entirely stops, so that warm surface water remains just off the South American coast and is able to warm the atmosphere. El Niño years are usually warmer than normal years, and the phenomenon also affects the monsoons in Africa and Asia, and precipitation in Australia and South America. La Niña is just the opposite: the circulation of surface water and the trade winds are strengthened, increasing the speed with which warm surface water is brought back to the ocean depths and giving it less time to warm the atmosphere. La Niña years are usually a little colder than normal years. ENSO affects weather all over the world, while NAO and PDO at least affect the Northern Hemisphere. There is no evidence whatsoever that any of these three circulation patterns has been affected by humans. They are just natural variations continuing as they have for a very long time, probably many thousands of years.
In 1997-98 there was an El Niño which was probably the strongest since 1877-78. 1998 was also the warmest year in the MSU record. According to the IPCC, 1998 was the warmest year in over a thousand years, but there are reasons to doubt that. 1975, on the other hand, was a La Niña year, and colder than normal. During the following two years, 1976 and 1977, the PDO switched to a warmer level, for some reason. This warming is also apparent in the calculation of the mean surface temperature. In the maps of world temperature trends that the IPCC presents (IPCC 2001a, p 116 and IPCC 2001c, p 27), the period 1976-2000 is the last. This period has apparently been chosen to use natural events to create as frightening a picture of global warming as possible.
I have one more example of how carefully climate researchers choose their data series. Santer et al. (1996) claimed to have found strong evidence that the human influence on climate is already noticeable. This evidence included a comparison of tropospheric temperatures, from radiosondes on weather balloons, with model results on how human emissions should affect climate. The correlation was very good. There was one non-negligible problem, however. The researchers had selected the years 1963-1987 from a data series which starts in 1958 and is still being extended each year. At the time of the publication of the paper, the last few annual mean temperatures, those of the years 1992-1995, were below those of the first few years, 1958-61, and the trend had been towards colder temperatures since the late 1970's. Only by a "skilful" selection did these scientists manage to get a good correlation between observations and theory, or, for that matter, the result that there had been a warming of the troposphere at all. When this fact was brought to their and the public's attention and they tried to defend themselves against the critics (Nature 384, 522-524), part of their defence was that it can be expected that natural variations sometimes hide the trend expected from the theory of anthropogenic (that is, human-induced) warming.
When reality and theory agree, it is good. There is no doubt about that. This is true in all science. But when reality and theory do not agree, there is still no problem, according to these researchers. This kind of reasoning is not found in science, only in pseudo science. It is in fact dealt with in the fifth and sixth of Hansson's criteria. The fifth says that counter-evidence is ignored: the theory is claimed to be correct although there are facts that say the opposite. The sixth says that a pseudo scientific theory always has built-in excuses which make it possible to test the theory only positively, and impossible to design tests which could disprove it.
The climate models used give too much warming of the world, compared to reality. This has even been admitted by the IPCC (1996, p 295), which is probably one of the least known results from this organisation. The main reason for the failure of the models is that they contain amplification mechanisms that do not exist in the real atmosphere. A number appearing often in climate research is that a doubling of the CO2 concentration of the atmosphere would give a global warming of 1.2 °C, if everything else remained the same. According to the latest reports by the IPCC, the expected global warming up to the year 2100 is 1.5-5.8 °C, in spite of the fact that the IPCC actually allows for the possibility that the CO2 concentration will not double (IPCC 2001a, appendix II). The temperature increase from an increasing greenhouse effect is expected to be amplified by water evaporating, increasing the amount of water vapour, which is another (in fact the most important) greenhouse gas in the atmosphere.
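The 1.2 °C figure for a doubling with everything else unchanged can be checked against two numbers from the general literature, not from this article: the logarithmic forcing fit of Myhre et al. (1998), about 5.35·ln(C/C0) W/m², and a no-feedback (Planck) response of roughly 0.3 °C per W/m². A minimal sketch, with both of those numbers taken as assumptions:

```python
import math

def co2_forcing(c, c0):
    """Radiative forcing in W/m^2 for a change in CO2 concentration
    from c0 to c, using the logarithmic fit of Myhre et al. (1998)."""
    return 5.35 * math.log(c / c0)

# Approximate no-feedback climate sensitivity, K per (W/m^2).
LAMBDA_0 = 0.3

dF = co2_forcing(2.0, 1.0)   # a doubling: about 3.7 W/m^2
dT = LAMBDA_0 * dF           # about 1.1 K, close to the 1.2 C quoted
```

Without the water-vapour amplification discussed below, this back-of-the-envelope estimate is all a doubling gives.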
Models with this assumption give too much warming. Real scientists would then have realised that the assumption is wrong. There is a good reason to believe it to be wrong: water vapour also forms clouds, which cool the surface by reflecting solar radiation back to space. More water vapour would also mean more clouds, working against the role of water vapour as a greenhouse gas. There are even facts that say that the assumption is wrong. Measurements in water containers at weather stations show that, during the last 50 years, the evaporation of water has in fact decreased (Roderick and Farquhar, 2002), probably precisely because of increasing cloudiness.
Instead of acting like scientists, the climate catastrophists have come up with the hypothesis that the burning of oil, coal and natural gas creates large amounts of so-called aerosols. These are a kind of pollution that cools the atmosphere and (according to the climate catastrophists) hides the true warming from the anthropogenic greenhouse gases. There are no data whatsoever to support this assumption. Nor is it needed, if the ridiculous amplification effects are removed from the models.
In one further respect the models do not agree with reality. According to the models, the polar areas should warm at a faster pace than the rest of the world. This has not happened. In Antarctica, except for the northernmost part of the Antarctic Peninsula, temperature measurements are only available since the end of the 1950's. These measurements show no trend whatsoever during this period. In the Arctic, there exist longer data series, sometimes starting in the middle of the 19th century. Most of the Arctic areas were at least as warm in the 1930's as they are now. Anyone may check this for themselves in the data available at http://www.giss.nasa.gov. That the models fail also in this case has not led the climate catastrophists to abandon them, because what they are involved in is pseudo science.
The fourth of Hansson's criteria is unwillingness to test the theory against reality, although it would be possible. A lot of climate data is now available on the Internet, for example at Trends Online (http://cdiac.esd.ornl.gov/trends/trends.htm). Of special interest are two data series. The first is estimates of carbon dioxide emissions from different sources (Marland et al. 2002, Houghton & Hackler 2002). The second is measurements of carbon dioxide concentrations, in air bubbles trapped in the ice of glaciers (Etheridge et al. 1998), and in the atmosphere since 1957, at Hawaii and at the South Pole (Keeling & Whorf 2002). It seems that most (all?) climate catastrophists have never encountered any data from the real world. For example, the IPCC claims that if no action is taken to reduce the emissions, the carbon dioxide concentration of the atmosphere will double before the year 2100. In reality the concentration is now (early 2003) 370 parts per million (ppm) and has increased, on average, by 1.5 ppm, or 0.4%, per year for the last three decades (see for example IPCC 2001b, p 7). If the increase continues to be linear, 1.5 ppm/year, the increase up to the year 2100 will be about 40%. If the increase instead is exponential, 0.4%/year, the increase up to the year 2100 will be just under 50%. As you see, both numbers are well below 100%.
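These two extrapolations are easy to recompute; a minimal sketch, 97 years forward from 370 ppm (note that the compounded 0.4%/year case comes out close to 47%, not to a doubling):

```python
c0 = 370.0           # ppm, early-2003 concentration from the text
years = 2100 - 2003  # 97 years

linear = c0 + 1.5 * years          # constant increase of 1.5 ppm/year
exponential = c0 * 1.004 ** years  # compounding 0.4 % per year

lin_pct = 100 * (linear - c0) / c0        # about 39 %
exp_pct = 100 * (exponential - c0) / c0   # about 47 %
```

Either way the extrapolated increase stays well below the 100% a doubling would require.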
Half-life and (exponential) lifetime are well-known terms from radioactivity. The greenhouse gases which humans emit do not disappear from the atmosphere through radioactive decay, but these terms are still useful. In a simple model of how carbon dioxide is removed from the atmosphere, the formulae become very similar to those of radioactive decay. In this simple model, nature is assumed to have an equilibrium level (the preindustrial average concentration). If the true value deviates from the equilibrium level, nature tries to return to it, which of course can only happen with a certain delay: nature's uptake of CO2 is directly proportional to the deviation from the equilibrium value. Due to the human emissions, nature is today far away from the equilibrium. Ahlbeck (1998, 1999) and Dietze (2000, 2001) have, from such simple models, concluded that the IPCC overestimates future CO2 amounts. For carbon dioxide, the IPCC claims that the lifetime cannot be determined, because so many processes are involved (IPCC 2001c, p 38); the range of values given for those processes is 5-200 years. This is a remarkable claim. Of course it is possible to determine the past lifetime from data on how the carbon dioxide concentration has varied as a reaction to the human emissions. These emissions come from many different sources. The dominating ones are those from the extraction and use of fossil fuels. Another direct source is cement production. An indirect, and partly also direct, source is land use, for example cutting down forest to grow grain. Cutting down forest reduces the potential for nature to take up carbon dioxide from the atmosphere. If the trees are also burnt, or are left rotting on the ground, they become a direct source as well. For the direct sources, there are estimates of how large they have been every year since 1751. For the indirect sources, there are estimates since 1850.
I have put these estimates, together with the CO2 concentration measurements mentioned earlier, into a simple model of the kind described above. I find that the lifetime of carbon dioxide in the atmosphere is 27 years (with an uncertainty of only about 3 years) and that the equilibrium level is 287 ppm (see figure 1). The latter value is in good agreement with the preindustrial values from the Law Dome glacier ice. This lifetime means that 54 years after the extra carbon dioxide was emitted, only 14% of it remains in the atmosphere. Less than 3% remains 100 years after it was emitted. It is completely impossible to explain past CO2 concentrations with a lifetime much in excess of 27 years, if the estimates of the emissions are correct.
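The arithmetic behind these percentages is the same as for radioactive decay: with a lifetime tau, a pulse of extra CO2 decays as exp(-t/tau). The sketch below only reproduces that arithmetic with the fitted value tau = 27 years; it is not the author's actual fitting code:

```python
import math

TAU = 27.0  # fitted atmospheric lifetime in years, from the text

def remaining_fraction(t_years, tau=TAU):
    """Fraction of an emitted pulse of extra CO2 still airborne after
    t_years, in a first-order model: dC/dt = -(C - C_eq)/tau."""
    return math.exp(-t_years / tau)

# 54 years is exactly two lifetimes: e^-2, about 14 % remains.
# After 100 years: e^(-100/27), under 3 % remains.
```

With tau = 27 years, the 14% and 3% figures quoted above follow directly.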
That a simple model with only two parameters works so well over a time span of 150 years is surprising. It shows that the same process must have been at work all this time, trying to eliminate the extra carbon dioxide. Unlike the advanced models used by the IPCC, this model cannot tell where the extra CO2 has gone. Its advantage is that it fits the past data, unlike the advanced models.
There are also other facts indicating that the atmosphere reacts fairly quickly to changes. During World War I and the early 1920's, the CO2 emissions were rather constant. During the 1940's, the CO2 concentration remained almost constant. Shortly after the end of World War II, the emissions started to rise faster than before the war. The acceleration lasted only a very short time, since the emissions seem to have increased linearly with time ever since. For the CO2 concentration, the shift to a faster annual increase happened in the early 1960's; thereafter the concentration has increased almost linearly with time. Both these events indicate that the reaction time of the atmosphere to changes in the emission rate is a few decades.
How about the scenarios used by the IPCC, then? Let us, as an example, study the scenario labelled B1 (IPCC 2001a, Appendix II). There the emissions, that is, the sum of direct and indirect sources, continue to increase at more or less today's rate until the 2040's. Thereafter they start to decrease, and in the year 2100 the emissions are lower than today. The predicted concentration of CO2 in the atmosphere, however, continues to increase until the scenario ends in 2100. The model behind this scenario evidently assumes such a long lifetime for the human emissions that most of them remain in the atmosphere 60 years after they were emitted. This model has either never been tested against reality, or has been tested and is still used although it cannot produce correct results. How could a model explain the future, when it cannot explain the past?
One thing entirely missing from the reports of the IPCC is a comparison between the CO2 models and the real increase of carbon dioxide in the past. There is hardly any doubt that the comparison is missing because these models fail miserably and leave too much CO2 in the atmosphere, compared to the real world. However, we have one way of checking the models. The IPCC started to publish prophecies in the early 1990's, including model results for the year 2000. By now, you will probably not be surprised to learn that all these models overestimated the CO2 concentration in that year.
Another kind of simple model is to assume that a certain percentage, say 50%, of the emissions remains in the atmosphere. I have also included such a model in figure 1. Evidently it fits the first 90 years of data well, but after World War II it gives too high CO2 concentrations. This shows that less than half of the emissions actually remains in the atmosphere, unlike in the models used by the IPCC.
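The qualitative difference between the two kinds of simple model can be illustrated with a purely hypothetical emission series (the numbers below are invented for illustration and are not the real estimates): a first-order uptake model levels off when emissions are constant, while a constant-airborne-fraction model accumulates without limit:

```python
def box_model(emissions_ppm, tau=27.0, c_eq=287.0):
    """First-order uptake: each year nature removes (C - c_eq)/tau.
    Under constant emissions e, C approaches the ceiling c_eq + e*tau."""
    c, out = c_eq, []
    for e in emissions_ppm:
        c += e - (c - c_eq) / tau
        out.append(c)
    return out

def fraction_model(emissions_ppm, f=0.5, c0=287.0):
    """A fixed fraction f of each year's emissions stays airborne forever."""
    c, out = c0, []
    for e in emissions_ppm:
        c += f * e
        out.append(c)
    return out

# Hypothetical series: 0.5 ppm/yr for 90 years, then 2.0 ppm/yr for 60 years.
emissions = [0.5] * 90 + [2.0] * 60
a = box_model(emissions)       # saturates towards 287 + 2.0*27 = 341 ppm
b = fraction_model(emissions)  # keeps climbing as long as emissions continue
```

This is why the 50%-remains curve in figure 1 overshoots the measurements once post-war emissions accelerate: it has no uptake term that grows with the excess concentration.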
The climate catastrophists' doomsday prophecies meet at least four of the seven criteria set up by Professor Hansson for discovering pseudo science. I think that is enough to actually call them pseudo science. Hansson himself seems to consider meeting one criterion enough. This is understandable: if even one of the criteria is met, it cannot be real science.
Not only are the doomsday prophecies about the future climate pseudo science, they are also the most costly pseudo science in the contemporary world. This pseudo science is used to convince politicians that actions are needed to prevent the prophesied climate catastrophes. It has led to the Kyoto treaty, an agreement claimed to "save" the climate. In fact, it does nothing measurable to prevent climate change. Even the climate catastrophists admit this. Instead they see the Kyoto treaty as a first step. This first step should not be taken at all, because no real data from the real world indicate that the climate is in need of rescue. Instead we should stop listening to them, and stop wasting money that is used to produce ever more worthless doomsday prophecies in the form of unrealistic results from climate models. It is time to ensure that research on the future climate becomes a real science, and that resources are used to solve real problems.
Ahlbeck, Jarl, 1998: Atmosfärens halt av koldioxid, in Klimatpolitik efter Kyotomötet (in Swedish), Tor Ragnar Gerholm (Ed.), pp 93-101.
Ahlbeck, Jarl, 1999: Absorption of Carbon Dioxide from the Atmosphere, http://www.john-daly.com/co2-conc/ahl-co2.htm
Briffa, K.R., Osborn, T.J., Schweingruber, F.H., Harris, I.C., Jones, P.D., Shiyatov, S.G., and Vaganov, E.A., 2001: Low-frequency temperature variations from a northern tree ring density network, J. Geophys. Res. 106 D3, 2929-2941.
Daly, J., 2000: The Surface Record: 'Global Mean Temperature' and how it is determined at surface level. http://www.greeningearthsociety.org/Articles/2000/surface1.htm
Dietze, Peter, 2000: IPCC's Most Essential Model Errors, http://www.john-daly.com/forcing/moderr.htm
Dietze, Peter, 2001: Carbon Model Calculations, http://www.john-daly.com/dietze/cmodcalc.htm
Etheridge, D.M., Steele, L.P., Langenfelds, R.L., Francey, R.J., Barnola, J.-M., and Morgan, V.I., 1998: Historical CO2 records from the Law Dome DE08, DE08-2, and DSS ice cores. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A.
Houghton, R.A., and J.L. Hackler. 2002. Carbon Flux to the Atmosphere from Land-Use Changes. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A.
IPCC, 1996: Climate Change 1995 - The Science of Climate Change. Report of IPCC Working Group I. Cambridge: Cambridge University Press.
IPCC, 2001a: Climate Change 2001: The Scientific Basis. Contribution of Working Group I to the Third Assessment Report of the Intergovernmental Panel on Climate Change. J. T. Houghton, Y. Ding, D. J. Griggs, M. Noguer, P. J. van der Linden and D. Xiaosu (eds.). Cambridge University Press.
IPCC, 2001b: Summary for Policymakers. A Report of Working Group I of the Intergovernmental Panel on Climate Change. Available at http://www.ipcc.ch.
IPCC, 2001c: Technical Summary. A Report accepted by Working Group I of the IPCC but not approved in detail. Available at http://www.ipcc.ch.
Keeling, C.D. and T.P. Whorf. 2002. Atmospheric CO2 records from sites in the SIO air sampling network. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A.
http://cdiac.esd.ornl.gov/trends/co2/sio-mlo.htm (Mauna Loa, Hawaii)
http://cdiac.esd.ornl.gov/trends/co2/sio-spl.htm (Amundsen-Scott Base, South Pole, Antarctica)
Mann, M. E., Bradley, R. S. and Hughes, M. K., 1999: Northern Hemisphere Millennial Temperature Reconstruction, Geophysical Research Letters, 26, 759-762.
Marland, G., T.A. Boden, and R. J. Andres. 2002. Global, Regional, and National CO2 Emissions. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A.
Roderick, Michael L. and Farquhar, Graham D., 2002: The Cause of Decreased Pan Evaporation over the Past 50 Years, Science 298, 1410-1411.
Santer, B., Wigley, T., Jones, P., Mitchell, J., Oort, A., and Stouffer, R., 1996: A Search for Human Influences on the Thermal Structure of the Atmosphere, Nature 382, 39-46.
Figure 1. The thick, black line is the real CO2 concentrations, from air bubbles frozen into glacier ice at Law Dome in Antarctica (before 1958) and from direct measurements at the South Pole (1958 and onward). The thinnest, blue curve is CO2 concentrations calculated from the assumption that half of the emissions remain in the atmosphere each year. The medium, green curve is a fit to emissions and concentrations, with a preindustrial CO2 concentration of 287 ppm and a lifetime for the human CO2 emissions of 27 years.