
AGU 2013: The importance of 400 ppm CO2

On 1 June 2012, a concentration of 400 ppm carbon dioxide was measured in air samples in the Arctic. On 9 May 2013, Mauna Loa, the longest-running monitoring station, measured a daily average of 400 ppm carbon dioxide. Next year we may see the global average concentration reach 400 ppm, and the year after that 400 ppm could be measured at the South Pole. The 400 ppm number is arbitrary, but it is a symbol of the anthropogenic climate change that scientists have been talking about for many years.

Here at the University of Bristol, the approaching 400 ppm epoch prompted two questions: what do we know about 400 ppm CO2 climates, and how could that knowledge be used to galvanize action on climate change? But 400 ppm and climate change is a bigger issue than one university can take on, so we took our idea to the American Geophysical Union (AGU) Fall Meeting. With more than 24,000 attendees each year, AGU is the perfect place to talk about what 400 ppm CO2 means in a scientific sense and what we as scientists should do about communicating it.

Two sessions were proposed: one looking at the science of 400 ppm CO2 climates, co-convened with Kerim Nisancioglu of the University of Bergen, Norway; the other looking at communicating 400 ppm, co-convened with Kent Peacock of the University of Lethbridge and Casey Brown of UMass Amherst.

The communication session looked at how climate science could be communicated effectively. First to speak was Naomi Oreskes, who asked why scientists don’t allow themselves to sound alarmed when reporting alarming conclusions. Citing neuroscience research, Oreskes argued that when scientists conform to the ‘unemotional scientist’ paradigm they actually risk being less rational and sounding inauthentic. It was clear that Oreskes’ points struck a chord with the audience, as many of them queued up to ask questions.

Myles Allen made a compelling case for sequestered adequate fraction of extracted (SAFE) carbon, i.e. compulsory carbon capture and storage. Allen pointed out that people will always pay to burn carbon and argued that a carbon price is just a way to ‘cook the planet slower’. Robert Lempert took a less controversial stand and explained how uncertainty can be managed in robust decision making. Using hydrological examples, Lempert suggested that uncertainty can be dealt with by starting from the desired outcome and working backwards. The session finished with James Hansen, who talked about finding the right message, and how the things that people care about need to be communicated by the best communicators. Criticising the pursuit of unconventional fossil fuels, Hansen argued the need for a carbon tax whose revenue would be redistributed back to people. A lively question and answer session followed, with all the speakers engaging in a strong discussion and the audience contributing pointed questions. No problems with talking without emotion in this session!

The 400 ppm physical science session started by focussing on what information we could draw from climates in the past where CO2 is believed to have been ~400 ppm. The first speaker was Alan Haywood, who summarised the work of the PlioMIP project, which tries to understand the climate of the Pliocene (~3 million years ago): what it was like and why. The Pliocene is the most recent period in the past when atmospheric CO2 concentrations could have been as high as they are today. Two more Pliocene presentations followed. First, Natalie Burls (standing in for Chris Brierley) explained that even with CO2 set to 400 ppm in their climate model simulations they could not match the warm temperatures reconstructed from Pliocene data, suggesting that either the climate models are not sensitive enough to CO2 or that there are other dynamical processes that we do not fully understand yet. Thomas Chalk gave a comparison between different methods for reconstructing CO2 in the past, and concluded that the Pliocene concentration was indeed around 400 ppm.

The final talk in the palaeoclimate part of the session was given by Dana Royer, who presented the most compelling evidence for very different climates in the past, with polar forests at 80°N indicating annual mean temperatures in the Arctic that were 30°C warmer than they are today! Dana presented new CO2 reconstructions demonstrating that the CO2 concentration at the time of the polar forests could have been around 400 ppm, again suggesting that our climate models may not be sensitive enough to CO2.
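To see why model sensitivity to CO2 matters here, a back-of-the-envelope sketch (not part of any of the talks) uses the standard logarithmic forcing relation: equilibrium warming is roughly the climate sensitivity per doubling of CO2 multiplied by log2 of the concentration ratio. The sensitivity value and pre-industrial baseline below are illustrative assumptions.

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_c=3.0, preindustrial_ppm=280.0):
    """Rough equilibrium warming (deg C) from the logarithmic CO2 forcing
    relation; sensitivity_c is the assumed warming per doubling of CO2."""
    return sensitivity_c * math.log2(co2_ppm / preindustrial_ppm)

print(round(equilibrium_warming(400.0), 2))  # ~1.54 C at an assumed 3 C per doubling
```

If reconstructed Pliocene or polar-forest temperatures are much warmer than this simple relation allows at 400 ppm, either the assumed sensitivity is too low or other processes are at work, which is exactly the tension the speakers described.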

The next part of the session looked at current CO2 levels, with a presentation by Steven Davis about the amount of CO2 that we have already committed to putting into the atmosphere. The energy infrastructure that we have already built amounts to future CO2 emissions of 318 Gt, and new global commitments are still increasing. Vaughan Pratt followed with a talk about the reasons for the recent pause in the global warming trend, separating out natural causes and anthropogenic causes using mathematical and statistical analyses. He concluded that the recent pause is of natural origin.
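The idea of separating a slow anthropogenic trend from oscillatory natural variability can be sketched with a simple least-squares decomposition. This is only an illustration on synthetic data, not Pratt's actual analysis; the trend slope, cycle period, and amplitude are all invented for the example.

```python
import numpy as np

# Synthetic temperature-like series: a slow warming trend plus a
# natural-variability-style oscillation (all values invented).
years = np.arange(1950, 2014)
t = (years - 1950).astype(float)
temps = 0.01 * t + 0.1 * np.sin(2 * np.pi * t / 60.0)

# Least-squares fit of a linear trend plus an assumed ~60-year cycle.
X = np.column_stack([t, np.sin(2 * np.pi * t / 60.0), np.ones_like(t)])
coeffs, *_ = np.linalg.lstsq(X, temps, rcond=None)
print(coeffs)  # recovers roughly [0.01, 0.1, 0.0]
```

The point of such a decomposition is that once the oscillatory component is fitted and removed, the residual trend can be compared against anthropogenic forcing; real analyses are of course far more careful about which basis functions are physically justified.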

The final part of the session peered through the looking glass into the future. Andrew Friedman investigated the causes of the temperature asymmetry between the northern hemisphere and the southern hemisphere and how that asymmetry may alter under future emission scenarios. He concluded that the asymmetry is set to increase into the next century, with the northern hemisphere warming faster than the southern hemisphere, and projected that the tropical rainbelt will shift northwards as a result.

The final talk of the session was given by Kirsten Zickfeld, who examined the climate changes we might already be committed to as a result of the CO2 emissions we have already released (under the assumption that atmospheric CO2 stays at 400 ppm). She used a climate model with biogeochemical components to identify how long it would take for the climate to reach equilibrium with the present CO2 concentration of 400 ppm, what the climatic impacts of that equilibrium might be, and whether it might be possible to return to CO2 levels below 400 ppm on human timescales by using negative emissions (carbon capture and storage schemes). She found that the already committed warming into the next millennium might amount to 1°C globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m due to thermal expansion alone, and further increases of 10 m due to ice melt are possible over much longer timescales. Committed changes for the ‘other CO2 problem’, ocean acidification, are relatively small, with a pH drop of only 0.01 projected. She concluded that even if CO2 levels could drop below 400 ppm in the future, whilst air temperatures may stabilise, sea level may continue to rise due to thermal expansion alone.
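Why does warming continue long after concentrations stop rising? A minimal one-box energy-balance sketch makes the lag intuitive; this is not Zickfeld's model, and the forcing, feedback, and heat-capacity values below are illustrative assumptions.

```python
import math

# One-box energy balance: C * dT/dt = F - lam * T, where the deep ocean's
# heat capacity C delays the approach to the equilibrium warming F / lam.
F = 5.35 * math.log(400.0 / 280.0)  # W/m^2, forcing for CO2 held at 400 ppm
lam = 1.25                          # W/m^2/K, feedback parameter (assumed)
C = 10.0                            # W*yr/m^2/K, effective heat capacity (assumed)

T, dt = 0.0, 1.0  # start unwarmed, step forward one year at a time
for year in range(1000):
    T += dt * (F - lam * T) / C

print(round(T, 2), round(F / lam, 2))  # T relaxes towards F / lam over centuries
```

With these assumed numbers the committed equilibrium warming is around 1.5°C, in the same ballpark as the ~1°C committed warming the talk described; thermal-expansion sea level rise lags even further behind because ocean heat uptake continues for as long as T sits below equilibrium.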

Both of the sessions were recorded for access after the event and provoked a lot of debate, during the sessions and online.  We hope that in some small way these sessions have helped scientists think differently about what 400 ppm means and what we can do about it.

This blog was written by T Davies-Barnard and Catherine Bradshaw, Geographical Sciences, University of Bristol.

