
Thursday, November 7, 2013

Is There a Role for Nuclear Power in Reducing CO2 Emissions?


Last Sunday, CNN reported (here) that top climate scientists had come out in favor of nuclear power as a "realistic" way to reduce carbon emissions. Slate also reported (here) that while "Many in the environmental community say that renewable energy is not a viable solution to the climate problem," climate scientist James Hansen counters that "...this is the equivalent of 'believing in the Easter Bunny and Tooth Fairy'". CNN also plans to air a documentary on Thursday, Nov. 7, Pandora's Promise (see the video clip below), investigating the resurgence of interest in nuclear power.

The media always seem to find black-and-white stories (Nuclear Power vs. Climate Catastrophe or Renewable Energy vs. Climate Catastrophe) compelling, but these are false choices. Almost a decade ago, Stephen Pacala and Robert Socolow of Princeton University began arguing that there is a wide range of technologies that could be applied to reducing carbon emissions. For example, see the 2004 article in Science, Stabilizing Wedges: Solving the Climate Problem in the Next 50 Years with Current Technologies, and the 2006 article in Scientific American, A Plan To Keep Carbon in Check. Their point might be too boring for a media special, but it's important to understand and keep in mind as the spin cycle starts winding up.

Pacala and Socolow argue for a divide-and-conquer strategy (see the graphic above, click to enlarge). By 2056, carbon emissions are forecast to double from the current level of about 7 billion tons per year to over 14 billion tons per year. No single technology, be it renewable energy or nuclear power, will be able to stabilize carbon emissions on its own. However, seven technologies that each cut carbon emissions by 1 billion tons per year by 2056 would, together, stabilize emissions. Pacala and Socolow go on to describe 15 current technologies (no pie in the sky) that are each capable of contributing 1 billion tons per year. One of the technologies involves doubling today's nuclear power output to displace coal-fired power plants.

What are the other technologies? Here's the list, including nuclear:

  1. Increase the fuel economy of 2 billion cars from 30 to 60 mpg.
  2. Drive two billion cars not 10,000 miles per year (current average) but 5,000 miles per year at 30 mpg.
  3. Cut electricity use in homes, offices and stores by 25 percent.
  4. Raise efficiency at 1,600 large coal-fired power plants from 40 to 60 percent.
  5. Replace 1,400 large coal-fired power plants with gas-fired plants.
  6. Install Carbon Capture and Storage (CCS) at 800 large coal-fired power plants.
  7. Install CCS at coal-fired power plants that produce hydrogen for fueling 1.5 billion vehicles.
  8. Install CCS at coal-to-syngas plants.
  9. Add twice today's nuclear output to displace coal-fired power plants.
  10. Increase wind power 40-fold to displace coal.
  11. Increase solar power 700-fold to displace coal.
  12. Increase wind power 80-fold to make hydrogen for cars.
  13. Drive two billion cars on ethanol using one-sixth of the world's cropland.
  14. Stop all deforestation.
  15. Expand conservation tillage to 100 percent of cropland.

So while we are all watching CNN's documentary or reading about the conversion of climate scientists, keep in mind that we have a large menu of (boring) choices, at least half of which will have to be deployed between now and 2056 if CO2 emissions are to be stabilized.




MORE READING

Stabilizing Wedges: Solving the Climate Problem in the Next 50 Years with Current Technologies

A Plan To Keep Carbon in Check

Amy Luers, Director of Climate Change at the Skoll Global Threats Fund (blogged about here), tweeted (here) "At least we must face the tough tradeoffs, not doing so is a form of denialism."

Slate The Pro-Nukes Environmental Movement

CNN Top Climate Change Scientists' Letter to Policy Influencers and Pandora's Promise

Sunday, November 11, 2012

Adding Economic Causes to the Hurricane Intensity Model


Today on Fareed Zakaria GPS, economist Jeff Sachs of Columbia University added some economic detail to the hurricane intensity model developed earlier by Kerry Emanuel of MIT (here). Sachs inserted economic growth into the model and argued that glacier melting and coastal development amplify hurricane impacts through higher sea levels and more homes built on prime development land near the ocean. The causal model above puts the two arguments together (click to enlarge) and shows the multiple positive impacts of economic growth on hurricane intensity and storm damage.

TECHNICAL NOTE: When reading causes off directed graphs, be aware that there are parameters associated with each arrow (discussed below) and that the parameters have either implicit positive signs or explicit negative signs. When you travel down a path, the parameters are multiplied together to determine the amount and direction of causation. For example, from CO2 Emissions -> Flooding in the graph above there are two negative signs that get multiplied together to create a positive effect.
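The sign rule in the technical note above can be sketched in a few lines of code: the net effect along a path is just the product of the signed parameters on its arrows. The path and coefficient values below are invented for illustration, not estimates from the model in the post.

```python
# Sign rule for directed causal graphs: multiply the signed parameters
# along a path to get the amount and direction of causation.
# Two negative arrows in a row yield a positive net effect.

def path_effect(coefficients):
    """Multiply signed path coefficients to get the net effect."""
    effect = 1.0
    for c in coefficients:
        effect *= c
    return effect

# Hypothetical two-arrow path (e.g., CO2 Emissions -> ... -> Flooding)
# with two negative signs: the product is positive.
print(path_effect([-0.5, -0.8]))  # → 0.4
```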

The role of the parameters can be seen from simpler Impact Models (e.g., the IPAT equation, the Kaya Identity and the Environmental Kuznets Curve):

In Impact Models, population growth (N) leads to economic growth (Q), which leads to energy use (E), which leads to CO2 emissions (C). By the rules of path analysis, C = N x (q x e x c), where q is productivity (output per capita, Q/N), e is energy intensity (E/Q) and c is emission intensity (C/E). These parameters can be influenced by technological change. The effect of population growth on CO2 emissions then depends on whether technological change reduces the intensities more rapidly than the population grows.

Saturday, November 10, 2012

Why Climate Change Will Lead To More Intense Hurricanes.



Here's an unbiased explanation from someone other than a Fox News pundit (it's from Kerry Emanuel of MIT) of the causal link between global temperature increase and hurricane impacts. The causal modeling behind Prof. Emanuel's discussion is presented below.


Tuesday, May 1, 2012

Climate Feedbacks and Climate Change Denial

The NY Times recently ran an article titled Clouds' Effect on Climate Change Is Last Bastion for Dissenters. What was interesting to me about the article was not only the right wing's reasoning behind climate denial but also the rather sophisticated appeal to climate change feedbacks as a reason not to worry about CO2 emissions. We've come a long way from arguing that GHG emissions don't cause global warming to the "last bastion" of climate change denial, the Iris Effect proposed by Richard Lindzen.

As the directed graph above shows, the right wing has now conceded that CO2 emissions increase global temperature. However, Lindzen argues that warming will increase rain at the equator, depriving cirrus clouds of the moisture necessary for their formation. Since cirrus clouds have the effect of warming the Earth by preventing heat from escaping to space, fewer cirrus clouds could mean a cooler Earth as the Iris opens.

Unfortunately, there is no data to support Lindzen's arguments. Although the feedback effect might exist, it is either (1) too weak to deal with the massive amount of CO2 that is being pumped into the atmosphere as a result of fossil fuel burning or (2) actually a positive loop.

The good news is that, supposedly, this is the right wing's last best argument. The bad news is that we're probably going back to one of the old irrational arguments.

Saturday, October 1, 2011

Right Wing Fixation on Fertilization

The NY Times ran a comprehensive article today (here) that explores the important role of the world's forests in controlling CO2 emissions and global temperature. The causal diagram below summarizes the article (click to enlarge).
CO2 emissions from fossil fuel burning enter the atmosphere, where the greenhouse effect increases global temperature. At the same time, atmospheric CO2 is absorbed (CO2 sequestration) by the oceans and by the forests. CO2 fertilization increases the growth rate of the forests, but wildfires, insect infestations and water deficits created by global warming decrease forest biomass, as do outright deforestation and poor forest management techniques.

The article details how the right wing has latched on to CO2 fertilization to argue that global warming (if it really exists) will benefit the planet. Unfortunately, the forces reducing forest growth are overwhelming the CO2 fertilization effect.

In addition to being sinks for carbon emissions, the forests and the oceans provide biodiversity (fish, animals and plants) that is threatened by ocean acidification and forest die-off. The article concludes that we cannot count on natural feedback effects to control climate, e.g., there are limits to how many trees we can plant on the available land as a way to absorb CO2 emissions. The only option is to reduce CO2 emissions.

Monday, June 27, 2011

For Those Who Have Lost Their Way: The Wayfinders


I just finished reading the last chapter of "The Wayfinders: Why Ancient Wisdom Matters in the Modern World." If you are convinced of the superiority of Western Civilization and Culture, the book will challenge your thinking. You can read excerpts from the final chapter "Century of the Wind" here.

The video clip above is followed by another clip (somewhat poorly recorded) "Climate Change: Pessimism is a Luxury".

Thursday, January 20, 2011

Integrated Assessment Models: PNNL GCAM



The IPCC is in the process of developing the next generation of emission scenarios. The scenarios are used to generate anthropogenic radiative forcings that drive Global Circulation Models. The so-called Integrated Assessment Models (IAMs) are being developed by government agencies and research laboratories in the U.S., the Netherlands, Japan and Austria. It is worth trying to understand and simplify these models because they are critical to conclusions the IPCC is drawing about climate change.

In the U.S., the Pacific Northwest National Laboratory (one of US DOE's ten national laboratories) has developed the Global Change Assessment Model (PNNL GCAM) which will be used to generate scenarios for the next release of IPCC documentation.

The PNNL GCAM model is essentially the same partial equilibrium model you saw in introductory economics textbooks. Basically, what these models (and their big brothers, general equilibrium models) do is compute long-run equilibrium prices. Partial equilibrium models look at one market (energy) while general equilibrium models look at all prices. Actual prices are assumed to be deviations from rationally determined long-run prices.
In partial equilibrium models, population growth, technology and gross domestic product are given exogenously. All the model does is calculate energy prices. The existing capital stock is assumed fixed.
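The partial-equilibrium idea can be made concrete with a toy calculation: everything except the energy market is exogenous, and the model's only job is to find the price that clears that one market. The functional forms and numbers below are invented for illustration; this is a sketch of the concept, not GCAM.

```python
# Toy partial-equilibrium model of a single (energy) market.
# GDP and capacity are exogenous; the model only solves for the
# market-clearing price. All numbers are illustrative.

def demand(price, gdp=100.0):
    return gdp / price            # demand falls as price rises

def supply(price, capacity=20.0):
    return capacity * price       # supply rises with price (fixed capital stock)

def equilibrium_price(lo=0.01, hi=100.0, tol=1e-9):
    """Bisect on excess demand to find the market-clearing price."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if demand(mid) > supply(mid):
            lo = mid              # excess demand: price must rise
        else:
            hi = mid
    return (lo + hi) / 2

p = equilibrium_price()
print(round(p, 4))  # 100/p = 20p, so p = sqrt(5) ≈ 2.2361
```

Everything outside the energy market (food prices, population, technology) stays frozen, which is exactly the limitation the next paragraph questions.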

It's hard to understand how a partial equilibrium model would be of much use in studying global climate change. Wouldn't there be impacts on food prices from oil prices and fertilizer prices? Wouldn't food prices have an impact on population and production? Wouldn't energy prices have an impact on technology and the existing capital stock e.g., replacing gasoline with electric vehicles?

These models were originally used to help the DOE study energy demand. They were recruited by the IPCC for the study of climate change.

PNNL also has a computable general equilibrium (CGE) model or Second Generation Model (SGM) that is described here. I'll talk about that model in a future post. CGE models are not necessarily an improvement.

Friday, January 14, 2011

Markets and Climate Change: One Analyst's View


Top Energy Fund Manager Questions Global Warming @ Yahoo! Video This analyst thinks that global temperature has been "cooling" over the last decade even though 2005 and 2010 were supposedly the hottest years on record (see my last post here). Is all this confusing enough or what?

Interesting Quotes:

"What we're looking at is alternative energy's economic at the present time and ... the only thing that makes sense is wind."

"The question is, are we going to get some kind of carbon tax that levels the playing field ... The cap-and-trade system we currently have doesn't do anything."


Thursday, January 13, 2011

Hottest Years on Record: 2005 and 2010


Yesterday, the NY Times reported (here) that new data show 2010 tied with 2005 as the hottest year on record. It's useful to compare the new data to the revised forecasts from the IPCC "Climate Change 2007: Synthesis Report" (here) and to my own forecasts presented below.

The IPCC projections (forecasts) are based on emission scenarios. There are four basic scenarios generated from assumptions about globalization and economic growth. The A1 scenario is high globalization and high economic growth. The B1 scenario is high globalization and sustainable development. The A2 scenario is based on low globalization and high economic growth. Finally, the B2 scenario is based on low globalization and sustainable development. The scenarios produce different GHG emissions (left panel above, click to enlarge) and different global warming paths (right panel). The global warming projections range from about 1 degree C to over 3 degrees C with error bars over 6 degrees C (the error bars for the scenarios are on the far right of the graphic).
There are a number of alternative global temperature forecasts on the web (here and here, for example). The projections are typically based on single-equation statistical approaches. The basic equation is T = sF + V, where T is global temperature, F is radiative forcing from GHG emissions or natural sources, V is natural variability and s is the climate sensitivity parameter. The increase in temperature is a direct result of forcing and climate sensitivity, with some natural or cyclical variability thrown in.
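The single-equation approach can be sketched directly from T = sF + V. The sensitivity value and the cyclical form of V below are illustrative assumptions, not fitted parameters from any of the linked forecasts.

```python
# Single-equation temperature model: T = s*F + V, where s is climate
# sensitivity, F is radiative forcing, and V is natural variability
# (modeled here, purely for illustration, as a 60-year cycle).

import math

def temperature(forcing, sensitivity=0.5, year=0):
    natural = 0.1 * math.sin(2 * math.pi * year / 60)  # cyclical variability V
    return sensitivity * forcing + natural

# With steadily rising forcing, the trend term dominates the cycle:
for year, f in [(0, 1.0), (30, 2.0), (60, 3.0)]:
    print(year, round(temperature(f, year=year), 3))
```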

My approach is based on state-space models that treat the environment and the economy as two interacting complex systems. The state of the environment is affected by the state of the economy and vice versa. Since the two systems can and do have negative impacts on each other, growth and global temperature peak over time. Notice that although the confidence intervals look wide, the top 98% confidence interval never hits the 2 degree C warming threshold considered critical by the IPCC.

The difference in projections is basically the result of the assumptions underlying the IPCC emission scenarios (the left graphic above). I'll go into more detail about, and criticism of, the emission scenarios (which are based on neoclassical economic growth theory) in a future post. My projections should not be taken as support for global warming denial. The models predict warming. What the world will look like after peak global warming, the models cannot predict.

Saturday, January 1, 2011

Top Performing Companies Welcome Environmental Regulation

Yesterday on CNBC (video here), Tim Solso, CEO of Cummins, Inc. (CMI) was interviewed on Fast Money. Cummins is one of the year's top S&P performers. Mr. Solso had some very interesting things to say about environmental regulation.

"In the 1990's, we saw regulation as a challenge and a problem. But [...now...] we think we're the technical leaders. We invest in key technologies... The tougher the emissions standards and the faster they're implemented, [sic] gives us an advantage. It's a barrier to entry for other engine manufacturers ... Emission regulations are going all over the world ... We're already starting to get ready for the 2014 CO2 regulations with better fuel economy which will benefit consumers. Regulations are a good thing for us and it's a good thing for clean air and a clean environment."

The quote demonstrates that the top performing companies welcome regulation. It's their weaker rivals that are the first to seek regulatory relief. And, the purpose of Capitalism is to sort out and eliminate the weak competitors. Smart environmental regulation is an essential part of the process. Next time you hear politicians complaining about the effects of regulation on "small business," remember that they really mean "weak" businesses.

Back to Cummins: their stock price history is displayed in the first image (above) along with step-ahead model predictions. The predictions are based on a best-fit model and, in this case, the best-fit model is based on secular and cyclical trends in the US economy (the USL20E model). Unlike GM (here), the Cummins stock price is not a random walk.
Over time, there have been periods when Cummins stock was over-valued and periods when it was under-valued. The graph above plots the USL20E model predictions without external shocks, that is, the equilibrium path for Cummins' stock price. Right now, at the end of 2010, Cummins stock is about at its equilibrium value.
For the future, the model predicts (above) that Cummins will have a pretty good run at least until 2015. However, there is a lot of variability in the prediction (the dotted lines are the upper and lower 98% prediction intervals), so there is plenty of both potential upside gain and downside loss if you're interested.

Sunday, December 26, 2010

Global Warming: Why Are The Winters Getting Colder?

In today's NY Times, seasonal weather forecaster Judah Cohen explains (here) how the Earth system can warm at the same time that winters in the Northern Hemisphere (NH) become cooler. Cohen also points out some problems in long-term weather forecasts and the Global Circulation Models (GCMs) used to predict climate change.

Here's the causal explanation (see the directed graph on the right): Global warming increases Arctic sea ice melt. As the sea ice melts, more moisture is released into the atmosphere. More moisture means that there will be more snow in Siberia. As snow cover increases, more energy is reflected back to space (the Earth's albedo or "whiteness" increases and white objects reflect more energy).

As more energy is reflected back to space, an Arctic cold-air dome forms over Siberia. The large dome of cold air shifts the jet stream from its normal west-to-east direction to a more north-south oscillation. As the jet stream dips from north to south, it picks up southern moisture and pulls cold northern Canadian air down into the US.
The predictions from Cohen's model for the U.S. (displayed at right) show that the Northeastern U.S. was predicted to be colder than usual while the Southern U.S. was predicted to be warmer. The actual trends (lower graphic) were very close to the model's predictions.

Long-term weather forecasts are largely based on the El Nino/La Nina-Southern Oscillation (ENSO). Warming or cooling of the tropical Pacific Ocean on a roughly five-year cycle causes weather disturbances across the entire planet. Since the oscillation is quasi-deterministic and since its effects on weather are known from historical data, long-term weather prediction is possible. The current long-term forecasts do not take Siberian snowfall into account, and neither do the GCMs used to predict global warming. We can expect some improvements in weather forecasting and climate change predictions as ENSO and the Arctic snow cycle are better understood.

Thursday, December 23, 2010

EIA Projects Climate Catastrophe?


The US Energy Information Administration (EIA) published an early release of the 2011 Annual Energy Outlook (here). In the report, the EIA projects that energy-related CO2 emissions will grow by 16% from 2009 to 2035, to a level of 6.3 billion metric tons of carbon dioxide equivalent (about 1.7 GtC). The non-skeptic climate blogs (here and here) are calling the projections a "catastrophe," except that the EIA projections are usually wrong (the EIA's own evaluations of its forecasts are here): (1) they assume the future will be like the past, (2) they don't model policy changes, (3) they underestimate the role of technology (reductions in emission intensity), and (4) they have ignored the possible effects of peak oil.

To this list, I would add that the EIA published no confidence intervals with the projection (see my forecast with confidence intervals here). In a well-constructed model, the confidence intervals account for the probabilistic effects of unanticipated changes in the future. The factors that might avert catastrophe and can be anticipated should be built into the models (a brief and completely inadequate discussion of the IEO2010 models can be found here).

Even with anticipated future changes that might reduce carbon emissions, solutions based on policy wedges (here) require a "...staggering amount of effort by both private and public sectors" if we are to keep the economy growing while at the same time reducing carbon emissions. Reduced economic growth rates, which both the EIA's projections and my own show did happen as a result of the global financial crisis, would provide the breathing room needed to implement policy wedges.

Unfortunately, our only thought right now is to get out of the global financial crisis and return to a level of "robust economic growth" and, as a result, robust CO2 emissions.

Tuesday, December 21, 2010

Controlling Carbon Emissions

On November 10, 2010, Nature published an article updating global CO2 emissions. The article noted that "global CO2 emissions from fossil fuel burning decreased by 1.3% in 2009 owing to the global financial and economic crisis that started in 2008; this is half the decrease anticipated a year ago." In other words, if there was any doubt about the link between CO2 emissions and economic growth, the global financial crisis provided a natural experiment demonstrating the link. It's very difficult (OK, impossible) to run experiments on the world system, so the result is an important finding.

The study goes on to note that once the global financial crisis is over, the economy is expected to resume growing (the IMF, here, expects the global economy to grow by 4.8% in 2010) and emitting at the same pace. The only hope for reducing carbon emissions then is to reduce the carbon intensity of the global economy, that is, quickly shift to low-carbon forms of energy (solar, wind, nuclear, etc.). To me, the shift seems unlikely (cars, buses, trucks and trains are unlikely to run on low-carbon fuel any time in the near future--even all-electric cars will run on energy from coal-fired power plants).

But the global financial crisis may have been a blessing in disguise, at least for climate change. The time series graph above (the y-axis is CO2 emissions in PgC per year from fossil fuel burning and cement manufacturing) takes the new emission data from the Nature study and forecasts it out to 2020, assuming that the world economy grows by 4% per year. Although the financial bubble and collapse are clear from the data (and are predicted quite well by the model), the future growth of emissions is relatively flat. Small reductions in economic growth would go a long way toward stabilizing CO2 emissions. Experience with the financial bubble just might lead to more modest growth than the IMF anticipates. And slower growth would provide some breathing room to reduce the carbon intensity of the global economy.
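The logic of a relatively flat emissions path can be sketched with a minimal compound-growth calculation: emissions grow with GDP but are offset by declining carbon intensity. The base value and the intensity-decline rate below are my own illustrative assumptions, not the post's fitted model.

```python
# Minimal emissions projection: each year emissions scale up with GDP
# growth and scale down with the decline in carbon intensity.
# Base level and rates are illustrative assumptions only.

def project(base, gdp_growth, intensity_decline, years):
    path = [base]
    for _ in range(years):
        path.append(path[-1] * (1 + gdp_growth) * (1 - intensity_decline))
    return path

# 4% GDP growth nearly cancelled by a 3% annual intensity decline
# yields only slowly rising emissions from 2009 to 2020:
path = project(base=8.7, gdp_growth=0.04, intensity_decline=0.03, years=11)
print(round(path[-1], 2))
```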

The Nature article also has updated analysis of the global carbon cycle. I'll analyze some of that in future posts.

Saturday, December 4, 2010

A Scientific Basis for Policy?

Wisconsin's newly elected Republican governor, Scott Walker, recently commented that


The phrase that caught my eye was "...ensuring decisions are based on objective science..." I wonder what that means? Walker, along with a lot of other Republican governors, refers to the IPCC's attempt to present the scientific consensus on global warming as the work of "discredited scientists." Really? Walker wants to ban stem cell research conducted by, guess who, scientists. Walker vows to cancel a high-speed rail project between Milwaukee and Minneapolis, even though political scientists think cancellation will obviously take jobs out of the Wisconsin economy (Walker ran on a pro-growth platform claiming that he will create 10,000 businesses and 250,000 jobs over the next four years, really).

Up to this point, our governor-elect has proved himself a master of Orwellian doublethink. He has also promised to reorganize Wisconsin government. I wonder whether this will involve creating a Ministry of Truth?

Monday, September 13, 2010

Russia is Burning



Shown above is a prediction for the year 2065 from the NOAA/GFDL CM2.1 Climate Model. Most of the World's land masses are predicted to be well above average temperatures for 1971-2000. What might be the consequences of global temperature change at this level?

Consider what is happening right now in Russia: This year, Russia recorded the hottest day (100 degrees) since record-keeping began in 1880. Wild fires continue to burn out of control. Smoke from the fires has substantially increased air pollution in Moscow. Russia's state environmental agency concluded in 2008 that Russia was warming twice as fast as the rest of the world. As a result of the heat and drought, Russia has banned the export of wheat while US wheat exports are booming.

Here's the causal model:


Positive signs on arrows are implied while two negative signs in a row (drought reducing the Russian wheat harvest which is negatively related to the US wheat harvest) are read as a positive impact (as discussed last week in Global Change). For good measure, throw in the floods in Pakistan (more energy in the atmosphere generating stronger monsoons).

Right now, the environment (especially climate change) no longer seems to be an issue in Washington. Yet Russia is burning. In 2065, the NOAA model predictions for the US are not reassuring, but that's 55 years from now and elections are in less than two months.

The Russian ban on wheat exports is also creating the potential for a world-wide food crisis. The NY Times, in an editorial, asks that Russia learn from the last food crisis, caused in part by demand for biofuels (another environmental issue), and not pursue "misguided policies". Imagine the rationality of policy making if world temperature increases by a few more degrees by 2065.

Tuesday, January 26, 2010

Science as System

In the current global warming debate it would be helpful to have an idea about how the science system works when trying to evaluate the widely publicized outputs. Here is a meta-model discussed today in the Principles of Environmental Science.

Science is fundamentally about developing models. Usually the models are expressed mathematically or as computer code or both. Model development, however, is just the first step. The model must be tested against reality and revised if it doesn't fit the data.

After enough independent tests, revisions and validations, a model can be generalized into a theory (a model along with all its tests and revisions). Theories, so defined, accumulate our base of knowledge (scientia).

Is this the way it always happens? Not exactly! Many models in economics and system dynamics skip directly from model to theory and received wisdom. For example, consider William Nordhaus' DICE model and the Limits to Growth models. I have been unable to find anything other than casual attempts to relate the model outputs to historical data.

Another approach in the global warming debate is to present tests without either a model or a theory. The work of Bjorn Lomborg provides a notorious example. Simple historical time plots are presented to "disprove" the global warming assertion and argue that things are actually getting better.

Compared to the two approaches above, the Intergovernmental Panel on Climate Change (IPCC) is doing science. The assessment and technical reports are essentially presentations of models along with results from testing. For some technical areas, there are no models. For some other technical areas, there are few tests. For the Integrated Assessment Models used to develop the family of Emissions Scenarios, it's sometimes hard to tell the status of the models. The testing probably needs improvement. Possibly that's why the IPCC publishes scenarios rather than forecasts with probability assessments. It's an important area that needs more scientific attention.

Tuesday, December 8, 2009

Stabilizing Emissions with Policy Wedges

Robert Socolow and Stephen Pacala have a plan to keep carbon emissions in check (as discussed in today's Global Warming Debate). The plan is based on a divide-and-conquer strategy: divide the total emission reductions needed into manageable pieces ("wedges") and propose existing technologies to tackle each wedge.

Total carbon emissions are forecast to be 14 billion tons a year by 2056. To get back to the 2006 level of 7 billion tons per year, you need seven billion-ton-a-year wedges over the next fifty years. Pacala and Socolow actually propose 15 wedges covering end-user efficiency and conservation, power generation, carbon capture and storage (CCS), alternative energy sources, and agriculture and forestry--an ample menu of existing technologies to choose from. And, after 2056, we can implement another 3 wedges to get us back to 4 billion tons a year, which is around the amount of carbon that the existing Earth systems can effectively absorb.
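The wedge arithmetic above is simple enough to check in a few lines. All the numbers come straight from the paragraph; the code just makes the bookkeeping explicit.

```python
# Wedge arithmetic from Pacala and Socolow, as quoted in the post.
# Units: billions of tons of carbon per year.

FORECAST_2056 = 14        # forecast emissions by 2056
LEVEL_2006 = 7            # 2006 emissions level
ABSORPTIVE_CAPACITY = 4   # roughly what Earth systems can absorb
WEDGE = 1                 # each wedge avoids 1 billion tons/yr by 2056

# Wedges needed to hold emissions at the 2006 level:
wedges_to_stabilize = (FORECAST_2056 - LEVEL_2006) // WEDGE
# Additional wedges after 2056 to get down to absorptive capacity:
extra_wedges = (LEVEL_2006 - ABSORPTIVE_CAPACITY) // WEDGE

print(wedges_to_stabilize, extra_wedges)  # → 7 3
```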

Essentially, Pacala and Socolow take the Emissions Equation and focus on carbon intensity, energy intensity and population growth (which could contribute one wedge if reduced) while leaving output per capita (economic growth) alone. This is a very attractive formulation (it's even been applied to controlling the US health care system). Electric cars, wind turbines, solar panels, new CCS coal-fired power plants, super-insulated homes, etc. all create economic growth and improve our standard of living.

There is even a Stabilization Wedge Game that can be used as a teaching tool. Actually, the game has drawn more criticism than the scientific articles: the costs are underestimated, the implementation time is underestimated and the demand side (population growth and economic growth) is ignored in favor of technological "fixes".

My problem with Stabilization Wedges is that they ignore the systemic aspects of the environment. Carbon is not the only problem facing the world system. Demand has increased our ecological footprint beyond sustainable levels and there is no quick technical fix for creating more ecological capacity than our current Earth system can provide. We will need both supply and demand solutions.

Tuesday, December 1, 2009

Carbon Accounting and Policy in Copenhagen

The challenge for policy makers at the upcoming (Dec 7 - Dec 18) UN Climate Change Conference in Copenhagen can be seen from some simple carbon accounting and one equation (as discussed today in the Global Warming Debate).


The figure above (from IPCC 2007 WG1 Ch. 7 Fig. 7.3) displays the estimated carbon cycle for the 1990's. What's important to notice are the equilibrium flows (up- and down-arrows). The black arrows indicate pre-industrial "natural" fluxes and the red arrows indicate "anthropogenic" (man-made) fluxes. The question here is how much carbon the biosphere and the oceans will absorb relative to how much is emitted. If you add together all the net fluxes for weathering, respiration, land, and oceans, you get 4.4 GtC/yr of carbon absorbed by the biosphere and the oceans. Against that, notice the 6.4 GtC/yr unbalanced emission from fossil fuels.
To say it another way, the biosphere and the oceans are capable of absorbing about 4.4 GtC/yr (+/- 20%). The figure above shows the actual world carbon emissions from 1950 to 2010. In 2008, we emitted 8.59 GtC/yr which is about twice the absorptive capacity of the biosphere and oceans. In other words, sometime in the 1970's or early 1980's the world's carbon cycle went out of equilibrium. Ultimately, that equilibrium has to be restored.
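The budget arithmetic in the two paragraphs above is worth making explicit: whatever is emitted beyond the sink capacity accumulates in the atmosphere. The numbers are the ones quoted in the post.

```python
# Carbon budget arithmetic from the post. Units: GtC per year.

SINK_CAPACITY = 4.4    # absorbed by biosphere and oceans (+/- 20%)
EMISSIONS_2008 = 8.59  # fossil fuel and cement emissions in 2008

# Carbon accumulating in the atmosphere each year:
excess = EMISSIONS_2008 - SINK_CAPACITY
print(round(excess, 2))  # → 4.19

# Emissions relative to sink capacity (roughly double):
print(round(EMISSIONS_2008 / SINK_CAPACITY, 2))
```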

The "Emissions Equations" shows our policy choices:

CO2 = (N) x (Q/N) x (E/Q) x (CO2/E)

(CO2 Emissions) = (Population) x (Output per capita) x (Energy Intensity) x (Carbon Intensity)

where Energy Intensity (E/Q) is the energy used per unit of output and Carbon Intensity (CO2/E) is the CO2 emitted per unit of energy.

If we want to control CO2 emissions, we can (1) reduce population growth, (2) reduce output per capita, (3) reduce energy intensity, or (4) reduce carbon intensity. Since reducing population growth is off the table (only China has tried population control), and since reducing output per capita (that is, slowing economic growth) is also off the table, we are left with the technical challenges of reducing energy intensity (heavily insulated buildings, electric cars, mass transit, etc.) and reducing carbon intensity through renewable energy sources (solar, wind, geothermal, etc.).
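The identity is easy to play with numerically. In this sketch, the four factor values are round, illustrative assumptions chosen so the product lands near the ~8.6 GtC/yr quoted above; they are not official statistics:

```python
# Emissions-identity sketch: CO2 = N * (Q/N) * (E/Q) * (CO2/E).
# All four factor values below are illustrative assumptions.

population = 6.7e9          # N: people
output_per_capita = 9.0e3   # Q/N: dollars of output per person per year
energy_intensity = 8.0e6    # E/Q: joules of energy per dollar of output
carbon_intensity = 1.8e-20  # CO2/E: GtC emitted per joule of energy

emissions = population * output_per_capita * energy_intensity * carbon_intensity
print(f"Emissions: {emissions:.1f} GtC/yr")

# The policy levers: halving any one factor halves emissions.
# E.g., halving carbon intensity (renewables) with the others fixed:
emissions_low_carbon = emissions * 0.5
```

The multiplicative structure is the whole point: growth in population and output per capita pushes emissions up, so energy intensity and carbon intensity must fall faster than the first two factors rise just to hold emissions level.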
If only it were as "simple" as totally changing out our existing energy systems. The figure above shows a plot of the World's Ecological Footprint (EF), a measure of human demand on all the Earth's ecosystems. The EF is calculated as the ratio of the number of Earths needed to support human demand to the number of Earths actually available (one). Interestingly enough, we exceeded the Earth's ability to meet human demands at about the same time in the 1970's that we exceeded the ability of the biosphere and the oceans to absorb our carbon emissions.

There is no immediate, quick-fix technical solution (heavily insulated buildings, electric cars, etc.) for the EF problem. However, both the EF and the carbon cycle have to be brought back into equilibrium. To be successful in Copenhagen, policy makers will have to find a way to take us back to the 1970's in terms of our carbon emissions and our demands on the Earth's ecosystems. With population growth and economic growth taken off the table, they don't have a chance.

Wednesday, November 18, 2009

A Lens on Climate Change

Had enough charts and box diagrams explaining climate change? Here's a great video site, Consequences by the NOOR climate change project (scroll down the page for video and photography about the pine beetle infestation in British Columbia; sea-level rise in the Maldives; the burning coal fields of Jharia, India; nomadic Nenet tribes under threat from global warming; the Canadian oil tar sands; Somalia's environmental refugees; and more).

Thursday, November 12, 2009

An Inconvenient Half-Truth

Al Gore's documentary and associated book, An Inconvenient Truth, have been the subject of intense controversy and have stood up quite well to the attacks. But nothing is perfect, and here's one example discussed this week in the Global Warming Debate.

From the book (page 196): "If Greenland melted or broke up and slipped into the sea--or if half of Greenland and half of Antarctica melted or broke up and slipped into the sea, sea levels worldwide would increase by between 18 and 20 feet." This is a true statement and if it happened the World Trade Center Memorial would be under water.

The assertion, however, raises two questions: (1) how likely is a sea-level rise of 18 to 20 feet, and (2) how likely is it that such a rise would result from the melting of the ice sheets in Greenland or Antarctica? The Intergovernmental Panel on Climate Change (IPCC) has the answer buried in its reports.
Sea-level rise (SLR) is forecast at about one-half foot every 100 years. In other words, at that rate sea level would not rise by 20 feet until around the year 6000. What are the most likely sources of SLR?
This graphic is a little more difficult to explain since it compares measurements from two periods (Blue = 1961-2003 and Brown = 1993-2003). The important point is that in either period, thermal expansion and glacier melt are the two largest sources of SLR. The melting of Greenland and Antarctica, given the error bars, contributes almost nothing.
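The "year 6000" figure quoted earlier is just a linear extrapolation of the half-foot-per-century rate. A quick sketch (assuming the post's 2009 date as the baseline year):

```python
# Linear extrapolation of the quoted SLR rate: ~0.5 ft per century.
# The 2009 baseline year is an assumption taken from the post's date.

rate_ft_per_century = 0.5
target_rise_ft = 20.0
baseline_year = 2009

centuries_needed = target_rise_ft / rate_ft_per_century  # 40 centuries
year_reached = baseline_year + int(centuries_needed * 100)
print(f"At {rate_ft_per_century} ft/century, a {target_rise_ft} ft rise "
      f"takes {centuries_needed:.0f} centuries (around the year {year_reached}).")
```

This is of course only the point of the half-truth argument, not a forecast: a constant rate ignores any acceleration or tipping point, which is exactly the caveat raised below.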

This is not to say that we should not be concerned about the Greenland ice sheet, or that there might not be a tipping point beyond which the rate of melting accelerates. It's just that it's pretty far off in the future.