What is Climate Change?
Humanity has always taken an interest in weather patterns. Over the last century and a half, the wider implications of global temperature trends, and how they might affect the planet, wildlife and humanity, have been studied in increasing depth. Environmental science is the study of the effects of natural and unnatural processes, and of the interactions of the physical components of the planet (particularly human action) on the environment (18). It covers a number of disciplines - climatology, oceanography, atmospheric sciences, meteorology and ecology - and has much in common with biology, physics, geology and many other older disciplines. Climate change is studied under the modern discipline of environmental science, which is a branch of the Earth and atmospheric sciences and crosses many boundaries, incorporating a wide variety of methods and tools.
19th Century Beginnings
We can trace the history of climate change in environmental science back to the 19th century, when the concepts of the “Ice Age” (1) and the “Greenhouse Effect” were first proposed. Even as early as the 1820s, scientists understood the properties of certain gases and their ability to trap solar heat. Both concepts took a while to gain acceptance, but once the evidence was beyond dispute, the scientific community began to ask the important questions: how did this happen? What made it happen? Why did the ice age end? Could it happen again? If so, how soon? The two theories were inextricably linked as researchers first proposed that lower levels of greenhouse gases in the atmosphere caused ice ages, and that higher levels led to much warmer temperatures (2).
Even then, understanding the properties of what we now know to be greenhouse gases, scientists in a growing industrialized world first raised the possibility that the world might eventually face a problem (though some proposed that industrialization was a positive thing that would stave off the inevitable next ice age, whenever that might be). It was not considered a concern for the immediate future: at the rates the planet was then consuming fossil fuels, experts calculated that it would take several thousand years to register significant warming and measurable effects on the climate. As the developed world expanded its industrialization through the 19th century, that figure was revised down to several centuries (3).
Early 20th Century
The early part of the 20th century saw fierce criticism of the existing theories. Skeptics argued that the concept of global warming was too simplistic and did not take into account local variations in weather, such as humidity (3). Flawed tests from the early 20th century were quickly thrown out, and for a while most of the scientific field lost interest in the problem (12). It took until the 1930s for researchers to begin to see the effects that burning fossil fuels was having on the climate. There had been marked changes since the industrial revolution, and between the wars that fact was increasingly noticeable (4). However, the consensus was that the planet was entering a phase of natural warming and that fossil fuels were not having a significant impact on the climate; dissenting voices were treated with skepticism, and some (admittedly flawed) tests came back with mixed results (3). Only Guy Stewart Callendar argued that the changes were anthropogenic. In his mind, though, it was a positive thing that would merely delay the next ice age (5). He estimated that the following century would bring a rise of two degrees, and recommended that researchers take more interest in the data (2).
1940s - 1960s
In the 1940s, experts recorded a 1.3C increase in the temperature of the North Atlantic since the end of the 19th century (6: p105); the conclusion was that the only greenhouse gases then known (carbon dioxide and water vapor) were responsible. Studies over the following decade confirmed this temperature rise, and it would not be until the 1970s that the other greenhouse gases and their effects would be identified: CFCs, nitrous oxide and methane.
The dawn of the nuclear age in the 1950s and 1960s, and the popular imagination's understanding of the damage such weapons could wreak upon the planet, gave researchers the opportunity to study the decay of carbon-14 isotopes (3) in the atmosphere. This was, and is, fundamental to our understanding of recent climate change, particularly the burning of fossil carbon sources. Carbon-14 is used for dating organic objects from recent history, with a limit of up to around 50,000 years (7). The revelations of Rachel Carson's book Silent Spring brought to the public imagination the real effects we may already have had on our planet (8).
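The roughly 50,000-year limit follows directly from the exponential decay of carbon-14. As an illustrative sketch (the half-life value and the decay formula are standard physics, not taken from this article), the age of a sample can be recovered from the fraction of carbon-14 remaining:

```python
import math

C14_HALF_LIFE_YEARS = 5730  # commonly used half-life of carbon-14

def radiocarbon_age(remaining_fraction: float) -> float:
    """Estimate a sample's age from the fraction of carbon-14
    still present, using simple exponential decay."""
    return -C14_HALF_LIFE_YEARS / math.log(2) * math.log(remaining_fraction)

# A sample retaining half its original carbon-14 is one half-life old:
print(round(radiocarbon_age(0.5)))    # 5730

# By ~0.2% remaining, the age is ~51,000 years - near the practical
# limit, because so little carbon-14 survives to be measured:
print(round(radiocarbon_age(0.002)))
```

Beyond about nine half-lives, too little of the isotope remains to measure reliably, which is why the method's reach tops out near 50,000 years.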
The 1950s was also the dawn of the computer era, which was fundamental to the growing interest in the climate. Most importantly, computers made it possible to analyze each layer of the Earth's upper atmosphere (3) far more easily, putting to rest the simplistic data and models of the early 20th century. This brought the first confirmation that increasing levels of carbon dioxide would have a warming effect over time. Furthermore, there was growing evidence that a doubling of carbon dioxide levels would lead to a global average temperature increase of 3-4 degrees.
Modern Climate Change in Environmental Science: 1970s-1980s
By the 1970s, data from a variety of disciplines such as paleontology, paleobotany, archeology and anthropology had led to the understanding that the Earth's climate has always changed, and of which factors had been forcing it. These disciplines brought a wider scope of data sets, meaning that not only could we see that temperatures were rising, we could see potential consequences too: scientists began speaking of critical changes to the climate from the year 2000. Most damagingly for popular opinion, a few fringe writers postulated the possibility of a new ice age arriving within the next few centuries, though even those experts said the data looked doubtful (3). The media ran with it, and it fuelled climate skepticism for the next several decades (6: p98).
The international community - both governments and research bodies - was increasingly concerned about what effects our actions would have on the climate and the future of our planet. In 1972, the United Nations formed the United Nations Environment Programme (UNEP) following the first United Nations Conference on the Human Environment, held in Stockholm, Sweden, to tackle a range of environmental issues including climate change (9).
Paleodata from ice cores was now fundamental to researching how climate change would affect the global environment and ecosystems, and it was during this period that researchers identified a massive increase in greenhouse gases from the time of the industrial revolution (6), particularly of methane. The concentrations recorded over the 19th century far exceeded all fluctuations of the previous half a million years.
The 1970s was also the era of chlorofluorocarbons (CFCs), which were found to be 10,000 times more effective at absorbing infrared radiation than carbon dioxide. Researchers also quickly discovered the devastating effect the chemicals were having on the ozone layer - the layer of gas protecting the Earth from the sun's most harmful rays (13). Once these damaging properties were confirmed, the substances were banned. This had massive implications for the humble toiletries in our bathrooms, as most aerosols contained the chemicals. What's more, CFCs were identified as existing purely as a product of industrial processes; they do not occur in nature (6: p100).
The evidence was mounting up, and 1988 saw the hottest year on record until that point (several more would follow) (2) and the founding of the Intergovernmental Panel on Climate Change (IPCC) (10). By 1988 we knew that in order to maintain stable global temperatures, the planet had to radiate as much energy as it received from the sun; we also knew that there was an increasing imbalance (2). In the same year, British Prime Minister Margaret Thatcher, whose qualifications were in chemistry, warned of the greenhouse gases being pumped into the atmosphere and the effects they would have in future. She called for a global treaty to tackle the problem for future generations (11).
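The balance between incoming sunlight and outgoing radiation can be sketched with the standard Stefan-Boltzmann energy-budget calculation. This is a textbook simplification, not a model from the article; the solar constant and albedo figures used here are conventional round values:

```python
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at the top of the atmosphere
ALBEDO = 0.3              # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

# Absorbed sunlight, averaged over the whole spinning sphere (divide by 4):
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Temperature the planet must reach to radiate that energy back out:
effective_temp = (absorbed / SIGMA) ** 0.25

print(round(effective_temp))  # about 255 K
```

Without an atmosphere the Earth would sit near 255 K (-18C); the observed surface average of roughly 288 K (15C) is maintained by greenhouse gases, and adding more of them shifts this balance toward warming - the imbalance the article describes.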
1990s - Now
The 1990s are considered by many to be the “Golden Age” of environmental science. Amongst other things, many of the top climate science journals began in the late 1980s, and it was in these years that the discipline truly began to take on board the widest range of data and methods from as broad a scope as possible. It was the dawn of the climate modelling on which the IPCC bases its reports; the First Assessment Report (FAR) came out in 1990 (6: p98).
Following calls by the UN to act on emissions, protocols agreed in Montreal and then London sought to phase out the substances most damaging to the environment. In the USA, the Clean Air Act Amendment of 1990 came into force (14) to tackle acid rain, ozone depletion, air pollution and a number of other environmental issues. Around the same time, most countries in the western world took steps to introduce similar standards through legislation.
Most critically as far as the science was concerned, ice cores taken from the Antarctic demonstrated that temperature rises preceded the increases in carbon dioxide levels - this put to rest once and for all the notion that the ice ages were fuelled purely and entirely by fluctuations in carbon dioxide levels (2). Ice core data has proven extremely useful in monitoring paleoclimate data. Each layer of snow, each build-up of ice, even each season of ice is different in texture and composition thanks to natural fluctuations. When factoring in larger events, it is relatively straightforward (when correlated with other data types) to work out what the climate was like in any given season (15). This was crucial in taking the study of the climate forward.
Now that the data was becoming more complex, it was time to understand and explain more complex systems, causes and effects. It is a common term now, but in the 1990s the “feedback loop” was a growing hypothesis. Data showed that around the ice ages there were massive changes in the environment (6: p106); each “forcing” (agent) has a different effect on the climate, sometimes positive and sometimes negative, and taking all of these into account required greater levels of modelling and regular fine-tuning. Today, the models are considered highly accurate; if anything, the IPCC has underestimated the effects of the various forcing agents on the environment (17). Climate sensitivity is central to all of this, with millions of years of data from which to draw conclusions (2). What researchers found in the mid-2000s was startling: in the past, a doubling of greenhouse gas levels had always led to a global temperature increase of 3C.
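Because warming scales with each doubling of CO2 rather than with each added unit, climate sensitivity is usually expressed logarithmically. A minimal sketch of that relationship, assuming the 3C-per-doubling figure cited above and a conventional pre-industrial baseline of 280 ppm (the baseline is an assumption, not a figure from this article):

```python
import math

SENSITIVITY_PER_DOUBLING = 3.0  # degrees C per doubling of CO2, as cited above

def equilibrium_warming(co2_ppm: float, baseline_ppm: float = 280.0) -> float:
    """Warming implied by a CO2 level, on the logarithmic rule that
    every doubling of concentration adds the same temperature rise."""
    return SENSITIVITY_PER_DOUBLING * math.log2(co2_ppm / baseline_ppm)

print(round(equilibrium_warming(560.0), 1))  # a full doubling -> 3.0
print(round(equilibrium_warming(420.0), 1))  # a 50% increase -> ~1.8
```

The logarithmic form is why a 50% rise in concentration already commits the climate to more than half of a full doubling's warming.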
It is modelling that drives climate study today, and though some of the early studies were contradictory, it wasn't until researchers began to study the oceans in 2005 that they could see and understand the full implications of greenhouse gas emissions (2). Land temperatures rise more rapidly than ocean temperatures, which explains why the northern hemisphere, with its greater landmass, has recorded the greater temperature increases. Overall, we now know that the global mean temperature rose by 0.85C (1.53F) between 1880 and 2012 (16).
Since 2010, there has been a growing occurrence of freak weather incidents. California and Australia have experienced more intense bushfires; Europe has suffered floods and storms; there have been record droughts and record snowfall. November 2013 to February 2014 saw a polar vortex hit North America, floods and storms in the UK, and record warm temperatures in Siberia. Exceptional wind, record rainfall and a strong jet stream contributed to one of the worst seasons on record.