A warmer world will have severe consequences for global agriculture, slowing yield growth even as populations increase at alarming rates.
One of the central problems in understanding climate change is understanding how it will impact the economy. Most current models assume that climate change will reduce usable output. The resulting effects on long-term growth under such an assumption are small. If usable output is reduced, there will likely be some reduction in savings and hence in capital available in future periods. Nevertheless, the core drivers of growth, such as technological development, remain. As a result, under most models, the world is an order of magnitude or more richer in several hundred years even if temperature change is extreme and the damages from climate change potentially intolerable. Some models consider the possibility that climate change will directly destroy capital (e.g., through the destruction of buildings in storms). While growth is somewhat slower in this formulation, the basic result is unchanged.
In this project we consider a simple modification to the standard damage function so that climate change can affect growth. We take a canonical endogenous growth model and modify it so that climate change has an impact on the sectors in the economy that contribute to growth. For example, if an economy is modeled as having an inventing sector and a manufacturing sector, we allow climate damages to affect the output of both sectors. With this simple reformulation, the effects of climate change can be dramatically different than in standard models. Over reasonably long periods, the growth effects dominate, showing that studying the impacts of climate change on growth should be of central concern.
Our goal is to consider this effect in all of the canonical endogenous growth models to see how robust the effect is across model choices. We have solved two models so far and likely will want to consider another two or three. In addition, we are simulating the results of one or more models to understand the likely size of the effects under differing parameter choices.
Characterizing potential future changes in temperature variability across frequencies:
The impacts of climate change on human societies may arise not just from changes in climate means but from changes in climate variability. Many agricultural crops, for example, are highly sensitive to even brief periods of stress temperatures, particularly at certain times of the growing cycle. Crop yields can be strongly affected by changes in temperature variability even in the absence of a change in mean. Understanding potential changes in temperature variability has therefore been a research priority in climate science.
Impacts of climate variability depend not only on the magnitude but also on the frequency of variations: day-to-day temperature fluctuations have different consequences than year-to-year differences. We therefore study variability using spectral methods that allow distinguishing timescales of fluctuations. Current projects address climate variability in a number of ways:
- comparing variability changes in present-day or preindustrial and future equilibrated climates
- developing methodologies to characterize variability changes in transient climates
- evaluating the effect of model spatial resolution on temperature variability
In our study of equilibrated climates, we compare pre-industrial and future climate scenarios in two different climate models, CCSM3 and GISS-E2-R, with millennial runs so that each climate state is stationary. Following techniques developed in Leeds et al. (2015), we compute integrated variability in distinct frequency bands and express changes as the future/present ratio of band-integrated variance. Within CCSM3, we also compare variability changes when warming is driven by two different forcing agents (CO2 and solar radiation).
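As a minimal sketch of the band-integrated approach, the future/present variance ratio in a frequency band can be computed along these lines. The synthetic series, band edges, and window length here are illustrative assumptions, not the actual data or settings of the study:

```python
import numpy as np
from scipy.signal import welch

def band_variance(series, fs, f_lo, f_hi):
    """Variance contributed by frequencies in [f_lo, f_hi), estimated by
    integrating the Welch power spectral density over the band."""
    freqs, psd = welch(series, fs=fs, nperseg=min(len(series), 1024))
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

rng = np.random.default_rng(0)
present = rng.standard_normal(8000)        # stand-in for a stationary control run
future = 1.3 * rng.standard_normal(8000)   # stand-in with inflated variability

# Future/present ratio of variance in a sub-annual band (daily data, fs = 1/day)
ratio = (band_variance(future, 1.0, 1 / 365, 0.5)
         / band_variance(present, 1.0, 1 / 365, 0.5))
```

Repeating the ratio for several disjoint bands (e.g. sub-annual, annual, decadal) gives the frequency-resolved picture of variability change described above.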
Characterizing and emulating variability in a transient climate:
Characterizing climate variability is easier in stationary conditions, but the Earth for the foreseeable future will be in a changing or "transient" state, in which temperatures are evolving in response to changed CO2 concentrations. Most archived climate model output also simulates transient states. We therefore seek to statistically describe how variability changes in a transient climate. This statistical exploration also serves our overall research goal: to use model information to produce "data-driven simulations" for use in impacts assessments. GCM output should not be used directly in impacts assessments, because GCMs do not fully reproduce present-day temperature distributions. Instead we develop methods of generating simulations of future temperatures that combine observational records with GCM projections of changes in variability (covariances). (See Figure 2 for a cartoon, and the description of emulation research.)
Again using the CCSM3 model, but now an ensemble of transient simulations rather than a single stationary run, we describe a statistical model for the evolution of temporal covariances in a GCM in response to altered CO2 levels. We find that, at least in CCSM3, changes in the local covariance structure can be explained largely as a function of the regional mean change in temperature, with a small term related to the rate of change of warming. (A warming climate will have slightly greater variability than an equilibrium climate at the same temperature.)
The statistical model can then be used to emulate the evolving covariance structure of temperatures, and therefore to create data-driven simulations that account for projected changes while still retaining fidelity with the observational record. We demonstrate the emulation of variability changes below, training the statistical model on model runs of several CO2 scenarios and using it to emulate changes under another scenario. Variability changes can indeed be described and emulated with a simple statistical model.
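The emulation idea can be sketched schematically: variability is rescaled by a ratio that depends on the regional mean warming and its rate, and the rescaled observational anomalies are recombined with the projected mean change. The functional form and coefficients below are invented for illustration, not our fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed anomalies about the local mean (stand-in for the observational record)
obs_anomalies = rng.standard_normal(500)

def variance_ratio(dT, dT_rate, a=0.05, b=0.8):
    """Hypothetical fitted relation: log variance ratio as a linear function
    of the regional mean temperature change dT (degrees C) plus a small
    rate-of-warming term (degrees C per year)."""
    return np.exp(a * dT + b * dT_rate)

# Emulate a future series: rescale observed anomalies by the projected
# variability ratio and add the projected mean change.
dT, dT_rate = 3.0, 0.02   # illustrative warming and warming rate
future_sim = dT + np.sqrt(variance_ratio(dT, dT_rate)) * obs_anomalies
```

The positive rate term mirrors the finding above that a warming climate is slightly more variable than an equilibrium climate at the same temperature.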
Partners: Larissa Nazarenko | Gavin Schmidt
A. Poppick, D. J. McInerney, E. J. Moyer, and M. L. Stein. Temperatures in transient climates: improved methods for simulations with evolving temporal covariances. arXiv preprint arXiv:1507.00683 (2015).
The study of future climate change necessarily involves numerical simulations. However, the state-of-the-art general circulation models (GCMs) used to produce climate projections are highly computationally demanding, so that century-long model runs may take months of calendar time to complete. GCMs are therefore too unwieldy for use in many contexts, especially in policy analysis. Analyses that involve optimal policy determination or uncertainty quantification require repeated iterations of climate projections for different CO2 trajectories, something that is computationally prohibitive with GCMs. We therefore seek techniques that capture (emulate) the behavior of GCMs but in computationally leaner tools that are useful in climate impacts assessments and other policy analyses requiring rapid climate projections.
We have developed a new approach for emulating the output of a GCM under arbitrary forcing scenarios using only a small set of precomputed runs from the model. We express temperature and precipitation as simple functions of the past trajectory of atmospheric CO2 concentrations, and fit the statistical model using a "training set" of model output (Castruccio et al., 2014). The approach captures the nonlinear evolution of climate anomalies shown in coupled climate models, and, once the statistical model is fit, produces emulated climate output effectively instantaneously. In our 2014 paper, we demonstrated emulation of temperature and precipitation over sub-continental regions. In current work, we have extended emulation to individual model grid cells, by incorporating information about spatial correlations (Figure 1 below).
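A stylized sketch of this kind of emulator follows. The exponential response kernel and all parameter values are illustrative assumptions chosen for clarity, not the fitted Castruccio et al. model:

```python
import numpy as np

def emulate_temperature(co2, sensitivity=2.5, tau=30.0, co2_0=280.0):
    """Emulated global-mean temperature anomaly as a lagged response to the
    past CO2 trajectory: forcing scales with log2(CO2/CO2_0), convolved with
    an (approximately normalized) exponential adjustment kernel with
    timescale tau in years. All parameters are illustrative."""
    t = np.arange(len(co2), dtype=float)
    forcing = sensitivity * np.log2(co2 / co2_0)   # equilibrium response
    kernel = (1.0 / tau) * np.exp(-t / tau)        # lagged-adjustment weights
    return np.convolve(forcing, kernel)[: len(co2)]

years = np.arange(2000, 2101)
co2 = 370.0 * 1.01 ** np.arange(len(years))   # illustrative 1%/yr scenario
temps = emulate_temperature(co2)
```

Because the emulated response is just a function of the CO2 trajectory, evaluating a new scenario costs a single convolution rather than a new GCM run, which is what makes repeated policy iterations feasible.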
In the work described above, we demonstrate emulation on a single climate model, CCSM3, with a custom-designed training set. In current work, we are extending the technique to emulate all major state-of-the-art GCMs, using only publicly available model output from the CMIP5 archive. We have tested 22 individual CMIP5 models to date; in all cases the simple emulation approach captures regional temperature evolution well.
Our work in multi-model emulation also includes evaluating how small training sets can be (in numbers of scenarios or realizations) while maintaining adequate emulation, and physically interpreting the fitted emulation parameters for each model. These fitted parameters offer a metric with interpretable physical significance that allows us to characterize and classify climate models.
Feifei Liu Crouch | William Leeds | David McInerney
This project is aimed at characterizing changes in future precipitation suggested by climate models, and using that information along with observations to produce data-driven simulations that can be used to evaluate impacts (e.g. changes in floods, water supply, and agricultural yields).
Climate models robustly predict that global rainfall will rise under warmer climate conditions and that rain events will become more intense (because a warmer atmosphere holds more water vapor). These two projections together require a third adjustment: because the projected increase in rainstorm intensity (about 6% per degree C of warming) is greater than the projected total rise in precipitation (about 3% per degree), some other aspect of precipitation in the climate models must change to compensate. Either storms are initiated less frequently, or rain events are shorter in duration, or rainstorms are smaller in extent, or some combination of these effects. To date no study has determined exactly how precipitation characteristics change, in part because of the statistical challenge of representing precipitation, a complex, non-Gaussian process with strong spatiotemporal correlations.
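The required compensation can be made explicit with a little arithmetic. Treating total precipitation as the product of storm frequency, duration, spatial extent, and intensity, small fractional changes in the factors simply add:

```python
# Model-projected fractional changes per degree C of warming (approximate)
d_total = 0.03       # total precipitation rises ~3% per degree
d_intensity = 0.06   # rainstorm intensity rises ~6% per degree

# Total ~ frequency * duration * extent * intensity, so for small fractional
# changes the pieces add.  The combined change in frequency, duration, and
# extent must therefore be roughly:
d_other = d_total - d_intensity   # about -3% per degree
```

Which of the three factors absorbs that roughly -3% per degree is exactly the open question this project addresses.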
We have developed a new approach to characterizing changes in precipitation in future climates that moves away from considering time series at individual locations, and instead uses image-processing algorithms to identify individual rainstorm events and capture the evolution of storms in space and time. The effort does require a very large volume of data: observations and model output at very high spatial resolution on time steps of a few hours or less. In this study, we use dynamically downscaled climate model output (CESM downscaled with WRF) at 12 km, 3-hourly resolution. We compare model output to real-world precipitation estimated from weather radar: the NCEP Stage IV data product at 4 km, 1-hourly resolution.
Figure 1: Example of rainstorm objects constructed by our rainstorm identification and tracking algorithm, from model output in (left) summer and (right) winter. Colors represent different identified rainstorm objects. Movement of same-colored regions shows the evolution of individual rainstorms over time. In summer, convective storms initiate over much of the U.S., especially in the West. Rainfall in winter occurs primarily in large-scale storms.
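The identification step can be illustrated minimally with connected-component labeling on a thresholded (time, y, x) precipitation field; connectivity across the time axis links a storm's footprint from one step to the next. The threshold and toy field are invented, and the project's actual algorithm is more sophisticated than this sketch:

```python
import numpy as np
from scipy import ndimage

def identify_storms(precip, threshold=1.0):
    """Label contiguous rainy regions in a (time, y, x) precipitation field.
    Full 26-connectivity in space-time links a storm across time steps, so
    each label tracks one storm object through its lifetime."""
    wet = precip > threshold                    # threshold in mm per step
    structure = np.ones((3, 3, 3), dtype=bool)  # 26-connectivity in 3D
    labels, n_storms = ndimage.label(wet, structure=structure)
    return labels, n_storms

# Toy field with two well-separated storms
field = np.zeros((4, 10, 10))
field[0:3, 1:4, 1:4] = 5.0   # storm A, persists for three time steps
field[1:2, 7:9, 7:9] = 3.0   # storm B, a single time step
labels, n = identify_storms(field)
```

Once each storm carries a single label, its intensity, duration, trajectory, and spatial extent can be tabulated and compared between present and future climates.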
In the model output studied, we find that storms do indeed become more intense in future climate conditions but that their initiation, duration, and trajectories are nearly unchanged. The compensating adjustment is that storms simply become smaller in spatial extent.
The goal of this work is not only to improve scientific understanding but to develop statistical methods for generating future precipitation scenarios that combine information from both observational data and climate model runs. That is, simulations should capture the changes shown in models but preserve the statistical characteristics of real-world rainfall. Altering the spatial extent of rain events in data is non-trivial. Figure 2 below shows a transformation algorithm applied to Stage IV radar data. In general, we hope this work will enable new ways of thinking about precipitation events in both meteorology and climatology.
Won Chang. A conditional simulation approach to future precipitation scenario generation. Poster presented at: Data Assimilation Workshop, STATMOS Summer School. May 2015.
Won Chang. A conditional simulation approach to future precipitation scenario generation. Poster presented at: STATMOS Annual Meeting. University of Chicago, Chicago, IL. September 2014.
Changes in U.S. temperature extremes under increased CO2 in millennial-scale climate simulations
Changes in extreme weather may produce some of the largest societal impacts from anthropogenic climate change. (At present, weather damages are dominated by rare events that happen only every several decades or more.) However, predicting future changes in those rare events is not possible using only the short observational record. Insight on changes in extremes must come from climate models, where we can generate long simulations.
In this project, we use millennial runs from the Community Climate System Model version 3 (CCSM3) in equilibrated pre-industrial and future (700 and 1400 ppm CO2) conditions to examine both how extremes change and how well these changes can be estimated as a function of run length.
Extreme value theory provides a means of estimating the far tails of distributions. We estimate changes to distributions of future temperature extremes by fitting annual maximum and minimum temperatures to generalized extreme value (GEV) distributions.
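As a brief illustration of the method (with synthetic annual maxima standing in for model temperatures; the distribution parameters are invented), a GEV fit directly yields return levels. Annual minima can be treated the same way after negating the series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic annual maximum temperatures (degrees C), one value per year
annual_max = stats.genextreme.rvs(c=0.1, loc=35.0, scale=2.0, size=1000,
                                  random_state=rng)

# Fit a GEV distribution to the block maxima
shape, loc, scale = stats.genextreme.fit(annual_max)

def return_level(N):
    """Magnitude exceeded with probability 1/N in any given year."""
    return stats.genextreme.ppf(1.0 - 1.0 / N, shape, loc=loc, scale=scale)

rl100 = return_level(100)   # the 100-year warm extreme
```

Comparing `return_level` curves fitted to preindustrial and future runs gives exactly the kind of return-level change plot discussed below.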
Using 1000-year preindustrial and future timeseries of temperatures in the contiguous United States, we show that changes in extremes are different in summer and winter. In winter, cold extremes generally warm much more than the mean shift in wintertime temperatures (Figure 1), while in summer, warm extremes generally warm only as much as the shift in means.
The changes in winter extremes involve more than a simple shift in magnitudes. The scale and shape of their distributions also change. This effect is best demonstrated by plotting the "return level" of extreme events, i.e. the magnitude of an extreme that recurs on a specified timescale. Figure 2 below shows changes in the 2-year to 100-year return levels for wintertime cold extremes. Over ocean regions (red lines at lower left are the Pacific off California; those on the right are the Gulf of Mexico and the Atlantic), changes are relatively flat across return periods, suggesting a simple shift in the distribution of extremes. Over land regions, however, changes show complex behavior. In the inland Southwest U.S., changes in the 100-year "extreme extremes" are larger than changes in the 2-year "moderate extremes". Further north, this pattern is reversed.
The 1000-year runs used here allow us to accurately determine changes in even 100-year extremes, but in practice, most modeling studies must rely on shorter runs. GEV modeling should allow estimation of infrequent events using relatively short time series, but it is important to understand its limitations for climate data. We therefore repeat the estimation of selected return levels (20-, 50-, and 100-year extremes) using segments of the timeseries of varying length. The resulting estimates can become very poor when the timeseries is comparable in length to or shorter than the return period of interest: that is, a 100-year model run cannot be used to reliably estimate changes in 100-year extremes. These results suggest caution when attempting to use short observational records or model runs to infer changes in extreme events.
Although many impacts of climate change remain uncertain, climate models robustly project that land surfaces become drier in warmer climate conditions. General circulation models (GCMs) almost universally show reduced soil moisture over much of the globe, even in locations where average rainfall increases (Figure 1). This change is a significant societal concern both because of potential effects on food production and because drier soils feed back on local weather conditions, exacerbating heat waves. (In wetter conditions, evaporative cooling buffers extreme temperatures.) These adverse consequences make understanding the surface-atmosphere interactions that control soil moisture a research priority.
While the model projections of drying are plausible, the model representation of soil moisture dynamics has not been fully validated. Soil moisture observations are sparse and controversial, and there is as yet no consensus on whether the historical record suggests that droughts are increasing. The RDCEP soil moisture project is using a unique observational resource to characterize soil dynamics and test the representation of soil moisture in models. We make use of the Department of Energy’s Atmospheric Radiation Measurement Climate Research Facility at the Southern Great Plains site in Kansas and Oklahoma (ARM-SGP), where soil moisture, moisture fluxes, and meteorological variables have been measured at hourly intervals over more than a decade, as a test-bed for characterizing statistical relationships between soil moisture, moisture fluxes, and local forcing variables (e.g. temperature, precipitation, wind speed, and relative humidity). We then compare against output from the Community Land Model (CLM) used in the Community Earth System Model, one of the most widely used climate models, to understand how well models capture the physics controlling soil moisture.
This project is supported in part by a University of Chicago - Argonne National Laboratory Strategic Collaboration Initiative seed grant.
Shanshan Sun. Statistical exploration of processes controlling soil moisture in present and future climates. Poster presented at: ARM/ASR Joint User Facility/PI Meeting, Tyson’s Corner, VA. March 16-19, 2015.
We are working on robust control in simple climate models coupled with economic growth, finance, and macroeconomic models. This effort is important for understanding how robustly accounting for climate change, technical change, and other sources of uncertainty affects not only economic growth and macroeconomics, but also asset prices, insurance pricing, and the pricing of carbon and other greenhouse-gas emissions.
Within the context of climate models, there are three contrasting approaches to robustness:
- adapting to potential model misspecification
- robustness adjustments for prior/posterior uncertainty
- “smooth” models of ambiguity aversion
Decision theoretic frameworks exist for all of these approaches, but their full consequences for economic models with climate change remain to be explored. To accomplish this we are developing numerical methods to support these analyses. In terms of discounting, our focus is on the consequences of uncertainty. There is an extensive literature from asset pricing and macroeconomics that uses stochastic discounting as a device to adjust discount rates for cash flow riskiness. We are drawing on this literature and incorporating compensations for aversion to ambiguity and concerns about model misspecification into models that feature explicit uncertainty and climate impacts on the economy. We are also contrasting pricing implications from market economies with social valuation.
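One standard way to capture concern for model misspecification is through multiplier preferences: the worst-case probabilities are an exponential tilt of the baseline model, with a penalty parameter governing how far the decision-maker entertains deviations. The payoffs and probabilities below are invented for illustration:

```python
import numpy as np

def robust_value(payoffs, probs, theta):
    """Expected payoff under the worst-case model implied by multiplier
    preferences: the minimizing distribution is an exponential tilt of the
    baseline, q_i proportional to p_i * exp(-payoff_i / theta). The penalty
    theta prices deviations (relative entropy) from the baseline model;
    large theta recovers the baseline expectation."""
    payoffs = np.asarray(payoffs, dtype=float)
    tilt = np.asarray(probs) * np.exp(-payoffs / theta)
    q = tilt / tilt.sum()
    return float(np.dot(q, payoffs))

payoffs = np.array([1.0, 2.0, 4.0])   # consumption outcomes, illustrative
probs = np.array([0.3, 0.4, 0.3])     # baseline model probabilities

baseline = float(np.dot(probs, payoffs))
cautious = robust_value(payoffs, probs, theta=1.0)
```

The cautious valuation sits below the baseline expectation because the tilt shifts probability toward bad outcomes, which is precisely how misspecification concern raises effective discounting of risky climate-exposed cash flows.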
Coupled Economic-Climate Models with Carbon-Climate Response: The economics of global climate change is characterized by fundamental uncertainties including the appropriate reduced forms for climate dynamics, the specification of economic damages resulting from climate change, and mechanisms by which these damages will affect long-run economic growth. We have developed and implemented a novel theoretical and computational integrated assessment modeling approach that is a well-grounded means of summarizing the fundamental relationship between human activity and the global climate for purposes of economic analysis. Using a dynamic integrated assessment framework, this project makes several contributions to improving the analysis of these uncertainties:
- First, we incorporate the carbon-climate response (CCR) function developed by Matthews et al. for representing the basic relationship between anthropogenic carbon emissions and increases in global mean temperature in a manner that is more directly policy-relevant than the usual approach based on the equilibrium climate sensitivity.
- Second, we adapt the tools developed by Hansen, Sargent and others for robustness analysis to address underlying model uncertainty in both economic and climate dynamics.
- Third, we allow climate change to affect economic growth directly, in addition to its effect on output.
We then develop and study a simple analytical model that yields insights and results on the key implications of these assumptions, as well as facilitating the interpretation of numerical results from a more general model. Among our findings is that the presence of robustness may result in either a decrease or increase in the optimal carbon tax and energy usage, depending among other factors on societal preferences.
Numerical Methods: In order to perform robustness analysis over a variety of fundamentally different models, we are developing PDE-based numerical methods for solving robust stochastic equilibrium models in continuous time with optimal control. This approach allows us to uncover solutions for the entire state space and to perform a quick sweep over a variety of models and parameters.
The 1930s Dust Bowl included the worst single-year drought in the western U.S. in the last millennium, as well as some of the worst droughts in the U.S. as a whole in the last century. We are investigating the agricultural impacts of a similar drought in the present day and in the near future, when temperatures are warmer and CO2 concentrations are higher. Initial results suggest that a drought of this magnitude in the current climate would result in production losses significantly larger than those of any observed drought in recent history.
Initial results indicate that by the middle of the 21st century, warmer temperatures and a dust-bowl-like drought may lead to a decrease in maize production of up to 80%. However, the 1930s Dust Bowl was unique in not only its intensity but also its persistence. Droughts were widespread in three out of five years, and temperatures were elevated throughout the decade. The implications of successive droughts for crop prices remain unclear. Current research focuses on linking persistent large-scale drought to crop prices through stock-to-use ratios.
Energy modeling, numerical modeling based on economic principles, has become the dominant analytical tool in U.S. energy policy. Energy models are widely used by researchers across the public and private sectors. However, the widespread application of these models in policy analysis poses challenges to decision-makers. We are developing a framework and analysis that demonstrate how non-Bayesian decision rules can address fundamental model uncertainty in the domain of energy policy, technological change, and greenhouse gas abatement.
Numerical modeling based on economic principles has become the dominant analytical tool in U.S. energy policy. Energy models are now used extensively by public agencies, private entities, and academic researchers, and in recent years have also formed the core of integrated assessment models used to analyze the relationships among the energy system, the economy, and the global climate. However, fundamental uncertainties are intrinsic in what has become the typical circumstance of multiple models embodying different representations of the energy-economy, and producing different policy-relevant outputs that model users are compelled to interpret as equally plausible and/or valid. Because the policy implications of these outputs can diverge substantially, policy-makers are confronted with a significant degree of model-based uncertainty and little or no guidance as to how it should be addressed.
This problem of “model uncertainty” has recently been the focus of work in macroeconomics, where scholars have studied how a decision-maker should proceed in the face of uncertainty regarding the correct model of an economic system that is the object of policy. We focus on analyzing a low-dimensional numerical integrated assessment model using the “minimax regret” criterion. Specifically, we have demonstrated that deep uncertainty regarding energy-related technological change can be addressed using this approach. Our findings include a comparison with expected cost minimization, showing how the interaction of solution methods and model structure affects the influence of this form of deep uncertainty on long-run CO2 emissions abatement policies. We have also examined other methods that assume prior distributions over uncertain parameters, to analyze the difference between our robust solution and the non-robust solutions from those methods.
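The minimax regret criterion itself is simple to state: for each candidate model, compute how much worse each policy does than the best policy for that model, then choose the policy whose worst-case regret is smallest. The cost matrix below is hypothetical, purely to show the mechanics:

```python
import numpy as np

# Hypothetical cost matrix: rows are abatement policies, columns are rival
# models (e.g. different technical-change assumptions); entries are costs.
costs = np.array([
    [10.0, 40.0, 70.0],   # low abatement
    [20.0, 25.0, 35.0],   # medium abatement
    [45.0, 30.0, 25.0],   # high abatement
])

# Regret: extra cost of a policy relative to the best policy for each model
regret = costs - costs.min(axis=0)

# Minimax regret: choose the policy minimizing the worst-case regret
worst_regret = regret.max(axis=1)
best_policy = int(np.argmin(worst_regret))
```

Note that no prior probabilities over the rival models are required, which is what distinguishes this rule from Bayesian expected cost minimization.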
We demonstrate that fundamental model uncertainty can be represented and analyzed in the context of energy policy problems that determine optimal CO2 abatement strategies. The robust solution from the minimax regret method is significantly different from any solution produced by sensitivity analysis over uncertain parameters or by methods that assume prior distributions over those parameters. The following figure shows the difference between the robust minimax regret solution over all three uncertain parameters (the red line) and minimax regret solutions over only one uncertain parameter, the technical change level, with the other two parameters treated through sensitivity analysis.
One of the most difficult aspects of evaluating the impacts of climate change on future agricultural production is compensating for change in technology over many decades.
An acceleration of technological research in agriculture, commonly referred to as “The Green Revolution”, occurred worldwide from roughly 1940 to the late 1960s. Practices in irrigation, management, and hybridization changed drastically.
A dataset was compiled from the Illinois Digital Environment for Access to Learning and Scholarship (IDEALS). It is a unique record of the evolution of technology and management practices during the Green Revolution, making it one of the very few lengthy, continuous, and consistent records of technological change in agriculture in existence.
This historical record will be used to better understand the range of possible technology pathways for the next several decades and what they mean for food security in the face of a changing climate.
Recently enacted state Renewable Portfolio Standards (RPSs) collectively require that U.S. electricity generation by non-hydro renewables more than double by 2025. These goals are not certain to be met, however, because many RPSs apply cost caps that alter requirements if costs exceed targets. We have analyzed the 2008 Illinois RPS, which is fairly typical, and have found that at current electricity prices, complete implementation will require significant decreases in renewables costs, even given the continuation of federal renewables subsidies. While full implementation is possible, it is not assured.
We also find that the statutory design raises additional concerns about potential unintended consequences. First, the fact that wind power and solar carve-outs fall under a single cost cap leaves each technology vulnerable to the economics of the other. In failure mode, a less cost-effective technology can curtail deployment of a more cost-effective one. Second, adjacent-state provisions mean the bulk of the wind power requirement under the Illinois RPS can be met by existing facilities in Iowa, where new builds will likely also occur. We conclude that the Illinois RPS, and likely those of many other states, combines objectives inherently in conflict, and these conflicts can create legislative failure: preferences for local jobs, for specific technologies, for environmental benefits, and for low costs. Since RPSs are at present the principal U.S. policy mechanisms for combating climate change, it is important to revisit existing legislation where necessary to ensure legislative success. The Illinois analysis can provide an example and guidelines for other states that will face similar pressure on their RPSs in the near future.
The social cost of carbon, defined as the present value of the change in consumption due to an incremental change in carbon emissions, is used by federal agencies in cost-benefit analysis of any regulation that changes emissions. In 2009, the Office of Management and Budget, through an interagency process, estimated the social cost of carbon and required all agencies to use its estimate. The central value was $21.40/tCO2, with a range of -$2.7/tCO2 to $142.4/tCO2.
The government’s estimate of the social cost of carbon, which is consistent with estimates by private researchers, implicitly assumes that economic growth continues even with substantial temperature increases. In one of the models, temperatures increase by 6.3°C by the year 2300. Even with this temperature increase, the global economy in that model is roughly 30 times larger than it is today on a per capita basis. The apparent reason is that damages from climate change do not affect growth: they are modeled as reducing usable output in a given year, with exogenously specified growth continuing regardless of climate damages.
We estimate the social cost of carbon when climate change reduces the growth rate of the economy. We use the same model as the OMB but modify it so that a fraction of the damages from climate change affects the growth rate rather than simply reducing usable output. Growth might be reduced, for example, because resources are diverted from research to adaptation to climate change. Even relatively small growth effects produce substantial changes in the social cost of carbon, suggesting that (1) estimates of the social cost of carbon are not robust to modest changes in the estimating model and (2) research into the impacts of climate change should focus on growth effects rather than level effects.
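The intuition that growth effects dominate level effects can be seen in a stylized present-value calculation (all parameters below are invented for illustration and are much simpler than the actual OMB model):

```python
import numpy as np

def damage_present_value(growth_damage_frac, horizon=200, g=0.02, rho=0.03,
                         level_loss=0.01):
    """Present value of consumption losses from a fixed 1% damage, split
    between a level effect (output lost each year) and a growth effect
    (a permanent reduction in the growth rate). Illustrative parameters:
    baseline growth g, discount rate rho, horizon in years."""
    years = np.arange(horizon)
    discount = np.exp(-rho * years)
    baseline = np.exp(g * years)                      # undamaged consumption path
    g_hit = g - growth_damage_frac * level_loss       # damaged growth rate
    damaged = (1.0 - (1.0 - growth_damage_frac) * level_loss) * np.exp(g_hit * years)
    return float(np.sum(discount * (baseline - damaged)))

pure_level = damage_present_value(0.0)   # all damages reduce output levels
mixed = damage_present_value(0.5)        # half of damages hit the growth rate
```

Because a growth-rate hit compounds while a level hit does not, shifting even half of a small damage into the growth channel multiplies the present value of losses many times over, which is why the social cost of carbon is so sensitive to this modeling choice.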
Chicago Climate Online (C2O), a policy initiative of the University of Chicago Law School, is a climate change policy research tool that seeks to become a preeminent policy resource for researchers. It combines features of Wikipedia -- summaries of current knowledge on climate policy topics -- with those of a database -- a list of research articles on those topics. It is an open-access research tool that also allows users to upload, create, and organize their own content.
The focus of C2O is on policy issues, not climate science. Although climate science continues to develop rapidly, we take the core of existing climate science as given. C2O is not a place to debate the validity of this science. The interconnectedness of policy and science, however, makes clear demarcation between the two impossible. C2O was developed with a primary focus on climate change policy, with inclusion of more science-focused topics and references only as necessary.
There is great uncertainty about the impact of anthropogenic carbon on future economic wellbeing.
We use DSICE, a dynamic stochastic general equilibrium (DSGE) model of the integrated climate and economy, to account for abrupt and irreversible climate change. DSICE is an extension of the DICE2007 model of William Nordhaus; it incorporates beliefs about the uncertain economic impact of possible climate tipping events and uses empirically plausible parameterizations of Epstein-Zin preferences to represent attitudes toward risk.
In this series of studies we model climate shocks in the form of stochastic tipping points, and investigate the impact of a tipping point externality on optimal mitigation policy. We find that the uncertainty associated with anthropogenic climate change implies carbon taxes much higher than those implied by deterministic models. This analysis indicates that there is much greater urgency to enact significant GHG policies immediately than is implied by DICE2007 and similar models that ignore uncertainty.