BPEA Conference Drafts, September 9, 2021

The Social Cost of Carbon: Advances in Long-Term Probabilistic Projections of Population, GDP, Emissions, and Discount Rates

Kevin Rennert, Resources for the Future
Brian C. Prest, Resources for the Future
William A. Pizer, Resources for the Future
Richard G. Newell, Resources for the Future
David Anthoff, University of California, Berkeley
Cora Kingdon, University of California, Berkeley
Lisa Rennels, University of California, Berkeley
Roger Cooke, Resources for the Future
Adrian E. Raftery, University of Washington
Hana Ševčíková, University of Washington
Frank Errickson, Princeton University

Conflict of Interest Disclosure: This paper draws on several years of work by the Social Cost of Carbon (SCC) Initiative at Resources for the Future (RFF), an initiative funded by private individual and foundation donors. Raftery’s work was supported by NIH/NICHD grant R01 HD-070936. Other than the aforementioned, the authors did not receive financial support from any firm or person for this article or from any firm or person with a financial or political interest in this article. None of them is currently an officer, director, or board member of any organization with an interest in this article.

ABSTRACT The social cost of carbon (SCC) is a crucial metric for informing climate policy, most notably for guiding climate regulations issued by the US government. Characterization of uncertainty and transparency of assumptions are critical for supporting such an influential metric. Challenges inherent to SCC estimation push the boundaries of typical analytical techniques and require augmented approaches to assess uncertainty, raising important considerations for discounting. This paper addresses the challenges of projecting very long-term economic growth, population, and greenhouse gas emissions, as well as calibration of discounting parameters for consistency with those projections. Our work improves on approaches previously used by the government, such as nonprobabilistic scenarios and constant discounting, which do not fully characterize the uncertainty distribution of the model inputs or of the corresponding SCC estimates. Incorporating the full range of economic uncertainty in the SCC underscores the importance of adopting a stochastic discounting approach to account for uncertainty in an integrated manner.

I. Introduction

As the primary economic measure of the benefits of mitigating climate change, the social cost of carbon (SCC) has been called “the most important number you’ve never heard of.”1 Put simply, the SCC is an estimate, in dollars, of the economic cost (i.e., damages) resulting from emitting one additional ton of carbon dioxide (CO2) into the atmosphere.
Conversely, it represents the benefit to society of reducing CO2 emissions by one ton—a number that can then be compared with the mitigation costs of reducing emissions. There are analogous metrics for methane (CH4) and nitrous oxide (N2O). The SCC has deep roots in economics. Indeed, many textbooks use carbon emissions and the resulting climate change as the canonical example of an externality that must be addressed through Pigouvian taxation or other means to maximize human welfare. In particular, basic economic theory recommends that an optimal tax on CO2 emissions (a “carbon tax”) be set equal to the SCC, for which marginal damages are measured along an optimal emissions trajectory (e.g., Pigou 1920; Nordhaus 1982).2 But the relevance and application of the SCC go well beyond its role in determining an optimal Pigouvian tax. As political leaders and stakeholders debate both the broad outlines and the fine details of policies to reduce carbon dioxide emissions, the SCC lies in the background as a remarkably important calculation, used by the US federal government for more than a decade for developing vehicle fuel economy standards and power plant emissions rules. 
Such analyses have been a mainstay of the regulatory rulemaking process since Executive Order 12291 was issued more than 40 years ago.3 The SCC also was the basis for the value of federal tax credits for carbon capture technologies, beginning in 2018,4 and for zero-emissions credits for nuclear power in New York State.5 The power grid operator for New York is working to include the SCC as a cost “adder” on top of energy supply bids submitted by power plants, thereby reflecting social costs in market prices and plant dispatch.6

1 https://www.economist.com/united-states/2017/11/16/the-epa-is-rewriting-the-most-important-number-in-climate-economics; https://www.bloomberg.com/news/articles/2021-01-22/how-do-you-put-a-price-on-climate-change-michael-greenstone-knows
2 This result derives from a simple model lacking many real-world complications such as leakage, tax-interaction effects, and other market distortions like R&D spillovers, but it represents a reasonable approximation.
3 Executive Order 12291 was the original Reagan-era guidance for benefit-cost analysis, later superseded by Executive Order 12866 in 1993.
4 https://www.whitecase.com/publications/insight/carbon-capture/us-tax-credit-encourages-investment
5 https://documents.dps.ny.gov/search/Home/ViewDoc/Find?id={44C5D5B8-14C3-4F32-8399-F5487D6D8FE8}&ext=pdf, page 131.

Many other states have used the SCC as the basis for climate policies7 and as a benchmark against which proposed carbon prices are compared.8 Proposed applications include federal procurement decisions9 and royalties on oil and gas leases on federal land (Prest 2021; Prest and Stock 2021).10 The construction of the SCC, and of the aggregate benefits of reducing emissions, is also distinct from the distribution of those benefits.
That is, because the consequences of climate change will be different for different communities (country, region, income, social identity), the benefits of mitigating climate change will similarly vary. For example, rising temperatures are likely to create heavier burdens on already hot (and often poor) countries like Bangladesh than on cold (and often rich) countries like Norway. Putting greater weight on dollar-value effects in poorer communities—that is, equity weighting (Errickson and others 2021)—is not current standard practice, however. Rather, the distribution of effects (when available) is presented alongside the aggregate, unweighted summary. Such weighting will become more important as our understanding of the distribution of effects improves.

Estimation of the SCC goes back to Nordhaus (1982) and has recently seen increasing prominence. In 2018, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was awarded to William Nordhaus (alongside Paul Romer) for his work incorporating climate change into economic analysis, including the role of the SCC in informing policy.

The SCC is typically estimated using integrated assessment models (IAMs), such as the DICE model developed by Nordhaus. IAMs couple climate and economic models to estimate the economic effect of an incremental pulse of CO2 emissions (in tons) on climate and economic outcomes. The net present value of changes in economic outcomes, divided by the number of tons in the pulse, delivers the SCC.

6 https://www.nyiso.com/carbonpricing
7 https://costofcarbon.org/states
8 https://www.rff.org/publications/explainers/carbon-pricing-101/, https://www.rff.org/publications/data-tools/carbon-pricing-bill-tracker/, https://www.wsj.com/articles/BL-EB-7156
9 https://www.whitehouse.gov/briefing-room/presidential-actions/2021/05/20/executive-order-on-climate-related-financial-risk/, Sec. 5(ii).
10 Many aspects of climate policy decisions are not necessarily tied to the SCC. Essentially, those include all policy design issues beyond measuring benefits and balancing them with costs, such as optimal R&D spending amid knowledge spillovers, cost-effective policy design (e.g., uniform standards versus flexible incentive-based policies), interactions between policies (e.g., Goulder 1995; Barrage 2020a, 2020b; Borenstein and others 2019), and differences in the distribution of the costs (and in certain cases government revenues) associated with different policy approaches. These are distinct from the question of estimating the marginal benefits of reducing emissions.

However, many IAMs used in SCC estimates have not kept up with rapidly evolving climate, economic, and demographic science. Moreover, as Nordhaus (1982) noted, many of the factors underlying the SCC are deeply uncertain—notably, our understanding of Earth’s climate, the effect of climate change on economic outcomes, and the future socioeconomic conditions that determine the discounted consequences of changes in emissions today. The need for robust policy decisions implies we should update the SCC over time to refine central estimates and the range of uncertainty as our scientific understanding progresses.

In this paper, we review efforts to update determinants of the SCC to reflect the best available science, based on the recommendations of a 2017 committee report by the National Academies of Sciences, Engineering, and Medicine (NASEM 2017). This updating is particularly relevant in light of Executive Order 13990 (January 20, 2021), which reestablished the Obama-era Interagency Working Group (IWG) on the Social Cost of Carbon and directed it to update the SCC. We also note other research efforts on updating the SCC.
The NASEM report recommended creating an integrated framework comprising four components (“modules”) underlying the SCC calculation:
• socioeconomics: probabilistic projections of population, gross domestic product (GDP), and emissions over multiple centuries;
• climate: an improved model of Earth’s climate system and climate change;
• damages: the economic consequences of climate change, based on recent studies; and
• discounting: aggregation of marginal damages to present value, using stochastic discount factors that correctly reflect the uncertain socioeconomic drivers above.

Figure 1 shows how the modules fit together, including how socioeconomics affect emissions trajectories, which are input into the climate model to project future temperatures. These temperatures are converted into a stream of future economic losses in the damages model (also influenced by socioeconomic trajectories), which are then discounted to a present value in the discounting module.

Figure 1. Modularized Approach to Estimating the SCC

Because the SCC represents the marginal effect of an incremental ton of emissions, this entire model is run twice—once as a baseline and once with a small pulse of additional emissions (Figure 2). The resulting change in the stream of economic damages per ton from this emissions pulse, in present value, is the SCC. More generally, when inputs to a module are uncertain (e.g., because of uncertainty about the climate’s response to emissions or about future economic growth), modelers have incorporated that uncertainty through Monte Carlo analyses, taking draws from (potentially correlated) probability distributions of each random variable. The result is a distribution of SCCs, often summarized by its expected value.
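The two-model-run logic and Monte Carlo averaging described above can be sketched in a few lines. The damage function, climate response, and parameter values below are illustrative assumptions chosen for exposition only, not the specifications of any actual IAM or of the IWG:

```python
import numpy as np

rng = np.random.default_rng(42)

def scc_monte_carlo(n_draws=2000, horizon=300, pulse_tons=1e9, r=0.03):
    """Toy SCC: discounted damage difference, per ton, between a run with
    an emissions pulse and a baseline run, averaged over uncertainty draws."""
    years = np.arange(1, horizon + 1)
    sccs = np.empty(n_draws)
    for i in range(n_draws):
        # Draw uncertain inputs (illustrative distributions, not calibrated).
        g = rng.normal(0.015, 0.005)                    # world GDP growth rate
        warming = rng.lognormal(np.log(4.5e-13), 0.3)   # deg C per cumulative ton CO2
        gdp = 1e14 * (1 + g) ** years                   # world GDP path ($/yr)
        cum_base = np.cumsum(np.full(horizon, 4e10))    # cumulative baseline tons
        cum_pulse = cum_base + pulse_tons               # same path plus the pulse
        def damage(cum):
            # Toy damage function: GDP loss quadratic in warming.
            return 0.01 * (warming * cum) ** 2 * gdp
        delta = damage(cum_pulse) - damage(cum_base)    # marginal damages each year
        sccs[i] = np.sum(delta / (1 + r) ** years) / pulse_tons
    return sccs

sccs = scc_monte_carlo()
print(round(sccs.mean(), 2))  # expected value of the SCC distribution ($/ton)
```

In a full implementation the draws of growth, climate response, and damages would be correlated, and the constant rate `r` would be replaced by a stochastic discount factor, as discussed in the discounting section.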
For example, the federal government’s current interim value of $51/ton CO2 reflects the expected value of the SCC over uncertainty in the climate’s warming response and scenarios of economic growth and population, at a 3 percent constant discount rate.

Figure 2. Estimating the Social Cost of Carbon under Uncertainty
Note: Estimation involves a baseline case (solid) versus a pulse of emissions (red dashed lines and areas). Shading depicts probability distributions on projections.

The NASEM report noted that IWG SCC estimates, including the current interim $51/ton SCC value, used somewhat dated and often simplistic modules. For example, the five socioeconomic “scenarios” were not developed with formal probabilities attached but were treated as equally likely. The scenarios did not incorporate the work done by economists, demographers, and statisticians to estimate and quantify uncertainty around long-term economic and population growth. The discounting approach also used a constant discount rate, rather than treating the discount rate as stochastic; that choice becomes increasingly important as the decision horizon extends into the future. The IWG noted the potential for a declining term structure and correlation between the discount rate and damage outcomes, but it did not consider an explicit stochastic discount factor that accounts for both future discount rate uncertainty and, through uncertain socioeconomic outcomes, correlation with the damages being discounted. To address such shortcomings, the NASEM report issued recommendations for improvement, which Executive Order 13990 specifically directed the IWG to consider.
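Why discount-rate uncertainty matters more as the horizon lengthens can be shown in a few lines: averaging discount factors rather than rates over an uncertain flat rate yields a certainty-equivalent rate that declines with the horizon. The rate values below are illustrative assumptions, not the IWG's:

```python
import numpy as np

# Five equally likely flat discount rates (illustrative, 1% to 5%).
rates = np.array([0.01, 0.02, 0.03, 0.04, 0.05])

def certainty_equivalent_rate(t):
    """Constant rate that reproduces the expected discount factor at horizon t."""
    mean_factor = np.mean(np.exp(-rates * t))  # E[e^{-rt}], not e^{-E[r] t}
    return -np.log(mean_factor) / t

for t in (10, 100, 300):
    print(t, f"{certainty_equivalent_rate(t):.2%}")
```

Because low-rate scenarios dominate the expected discount factor at long horizons, the effective rate falls from near the 3 percent mean toward the lowest rate in the distribution as `t` grows.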
This paper documents recent work that has improved the scientific basis for the modules so that the IWG can update the SCC to reflect the best available science. Section II discusses the improved socioeconomic module, with long-term probabilistic projections of population, economic growth, and emissions. Section III illustrates how an incremental ton of emissions translates into climate and economic effects (“damages”). Section IV discusses the crucial role of the discount rate, given recent research on declining equilibrium interest rates, plus the importance of using stochastic discount factors and the shadow price of capital for valuing effects on investment. Section V then combines these elements into a simplified model of the SCC, with associated uncertainty bounds for the socioeconomic, climate, damages, and discounting components. Finally, section VI concludes and raises issues that await future research.

II. Economic and Demographic Drivers of Climate Effects

Assessments of damages from climate change are influenced by projections of population, economic growth, and emissions. Population growth can drive emissions and increase or decrease total economic exposure to the health effects of climate change. Economic growth similarly affects both the level of expected emissions and the resulting damages, which are often estimated to scale with economic activity (Diaz and Moore 2017a). For example, the monetization of mortality consequences typically depends on per capita income. Economic growth projections can also influence the SCC through the discount rate if estimates are calculated using Ramsey-like discounting, where the discount rate is a function of the rate of economic growth: higher (lower) growth scenarios will yield a higher (lower) discount rate.
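The Ramsey link between growth and the discount rate is a one-line formula, r = ρ + ηg, where ρ is the rate of pure time preference and η is the elasticity of marginal utility of consumption. The parameter values below are illustrative calibration choices, not the paper's:

```python
def ramsey_rate(g, rho=0.01, eta=1.5):
    """Ramsey discount rate: pure time preference plus eta times per-capita
    consumption growth (rho and eta are illustrative values)."""
    return rho + eta * g

# Higher growth implies a higher discount rate, and vice versa:
print(ramsey_rate(0.02))    # 2% growth   -> 4% rate
print(ramsey_rate(0.005))   # 0.5% growth -> 1.75% rate
```

Under this rule, uncertainty in long-run growth translates directly into uncertainty in the discount rate, which is why probabilistic growth projections and discounting must be treated jointly.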
Finally, projections of global emissions determine the background state of the climate system against which damages from an additional pulse of emissions are measured. Estimates of the SCC are highly sensitive to socioeconomic and physical projections (Rose and others 2017), but revised estimates have been based primarily on changes in socioeconomic projections, not on improved understanding of the climate system (Nordhaus 2017b). Explicitly considering realistic, probabilistic socioeconomic projections is thus important for improving the characterization of both the central tendency and the uncertainty in the SCC.

A robust characterization of socioeconomic contributions to SCC estimates would ideally incorporate probabilistic projections of population, economic growth, and emissions. The particular requirements of SCC estimation, however, pose significant challenges for generating such projections. One is the time horizon: given the long-lived nature of greenhouse gases in the atmosphere, the SCC needs to account for discounted damages 200 to 300 years into the future (NASEM 2017). Yet nearly all projections end at year 2100 and are often scenario-based rather than probabilistic. New probabilistic projections that extend well into the future are required. Another challenge is that although climate change can be projected from emissions scenarios consistent with globally aggregated projections of economic activity and population growth, the resulting climate damages are most appropriately estimated at a regional (or even local) scale. Thus they require geographically disaggregated estimates of GDP and population. A third challenge is that the future path of emissions likely depends on uncertain improvements in technology and on the scale and success of policy interventions outside the range of the historical record.
That is, whereas historical data may be a reasonable guide to forecast population and economic activity, the same is not true for emissions. The SCC should be measured against our best estimate of future emissions, inclusive of future mitigation policies except the one under analysis. The fourth issue is the interrelated nature of these variables: the projections for each variable must be consistent with one another. For example, emissions intensity might be lower with higher economic growth (and its associated wealth and technological improvements).

Past Approaches to Socioeconomic Projections

In lieu of using fully probabilistic socioeconomic projections, researchers have typically turned to socioeconomic scenarios, which can provide consistency across analyses and still incorporate specific narratives. The IWG adopted a scenario approach in its initial estimates (IWG SCC 2010), and these same scenarios support the interim estimates put forward by the Biden administration in January 2021 (IWG SCC 2021). The IWG used five socioeconomic scenarios drawn from the Energy Modeling Forum (EMF) 22 (Clarke and Weyant 2009) modeling exercise, selected to roughly span the range of emissions outcomes in the full set of EMF 22 scenarios and thus represent uncertainty across potential socioeconomic projections. Only one of the scenarios represented future climate policy. The IWG extended the five scenarios to 2300 by assuming that GDP and population growth each decreased linearly to zero in 2300. The five scenarios were assigned equal probability for computing an expected value for the SCC (no such probabilistic interpretation existed for the EMF 22 work).
The IWG scenarios were critiqued for not spanning the uncertainty in a full set of relevant socioeconomic variables (e.g., GDP, population) or reflecting the broader scenario literature overall (Rose and others 2014; Kopp and Mignone 2012). The resulting SCC estimates, then, may not reflect damage calculations based on the full range of expected variation. The NASEM panel noted that the IWG did not provide a rationale for its scenario weighting or the choice to extend the scenarios from 2100 to 2300 by assuming that GDP and population growth each decreased linearly to zero. The panel recommended using a combination of statistical methods and expert elicitation to generate a set of probabilistic long-term projections for each variable. Subsequently, a multidisciplinary research effort developed the Shared Socioeconomic Pathways (SSPs) (Riahi and others 2017), scenarios intended primarily to support the assessment efforts of the Intergovernmental Panel on Climate Change (IPCC). Each of the five SSPs consists of quantified measures of development and an associated narrative describing plausible future conditions that drive the quantitative elements. The SSPs end in 2100, but researchers have offered extensions to 2300 (Nicholls and others 2020; Kikstra and others 2021). The SSPs are freely available and comprehensive, have an extensive publication record, and are expected to be used in the IPCC’s Sixth Assessment Report. For these reasons, we use the SSPs as our primary point of comparison. Scenarios in general, and the SSPs in particular, do not come (as the IWG assumed) with associated probabilities. That limits their utility in evaluating uncertainty. Although the SSP authors have themselves cautioned against using the SSPs in a probabilistic fashion, Ho and others (2019) sought to address this limitation through an expert survey assessing the likelihood of each SSP. 
Others have sought to guide scenario usage by characterizing the plausibility of various scenarios (Stammer and others 2021). Even without formal probabilities, in practice the SSPs are often interpreted in modeling exercises as representing the uncertainty between high-emissions (SSP5) and low-emissions (SSP1) futures, at times with the implication that the difference represents a “no policy” counterfactual versus a “likely policy” scenario. This has led to a recent debate over the viability of the high-emissions scenario, given the current pace of technology evolution, among other factors (Hausfather and Peters 2020).

Previous efforts to quantify the uncertainty of socioeconomic projections over a century are limited. Raftery and others (2017) used a statistical approach to generate density functions of country-level economic growth per capita, population, and carbon intensity (CO2/GDP) to project a density of future emissions trajectories via the IPAT equation (Commoner 1972), similar to our socioeconomic approach. Müller, Stock, and Watson (2021, hereafter MSW) employed a Bayesian latent factor model that projects long-run economic growth based on low-frequency variation in the historical data of country-level GDP per capita.11 Christensen and others (2018) conducted an expert survey of economists to quantify the 10th, 50th, and 90th percentile ranges of economic growth for six groupings of countries. Comparing results with the SSP ranges, they found that the SSPs underestimated the range of uncertainty expected by the experts, and that using the wider range for economic growth in the DICE model suggested that the range of emissions was also underrepresented by the SSPs.
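The IPAT-style decomposition used by Raftery and others (2017) is an identity: emissions equal population times GDP per capita times carbon intensity, so a density for each factor induces a density for emissions. A minimal sketch, with round illustrative numbers:

```python
def co2_emissions(population, gdp_per_capita, co2_per_dollar):
    """IPAT/Kaya identity: CO2 = P x (GDP/P) x (CO2/GDP)."""
    return population * gdp_per_capita * co2_per_dollar

# Roughly 8 billion people, $12,000 GDP per capita, 0.35 kg CO2 per dollar:
tons = co2_emissions(8e9, 12_000, 0.35e-3)  # about 3.36e10 tons, i.e. ~34 Gt CO2
```

Uncertainty propagates by drawing joint trajectories of each factor and multiplying them period by period, which is how distributions over population, growth, and carbon intensity become a distribution over emissions futures.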
The NASEM report noted that statistical models based solely on historical data are unlikely to fully capture the variability of future projections over centuries, suggesting caution in using raw outputs from statistical models over long time scales. This concern led the NASEM panel to recommend using formal expert elicitation to quantify the uncertainty around long-run projections, which can then be used to augment projections from statistical models. We next describe efforts undertaken by the Resources for the Future (RFF) SCC Initiative and collaborators to build on both statistical and expert-based approaches to generate distributions of projections of population and GDP per capita at the country level, plus distributions of the three primary greenhouse gases (CO2, CH4, and N2O) at the global level. The resulting probabilistic distributions, collectively referred to as the RFF Socioeconomic Projections (RFF-SPs), fully incorporate the NASEM recommendations for generating an improved socioeconomic module for SCC estimation.

11 The MSW method extends the approach provided in Müller and Watson (2016), which was suitable only for global estimates of economic growth, to generate internally consistent growth projections at the country level.

Probabilistic Population Projections to 2300

Methods. To develop probabilistic, country-level population projections through 2300, we start with the fully probabilistic statistical approach that has been used since 2015 by the United Nations (UN) for its official population forecasts to 2100. We then extend the statistical model to 2300, incorporating feedback and improvements suggested by a panel of nine leading demographic experts that we convened to review preliminary results. This work is detailed in Raftery and Ševčíková (2022, forthcoming).
The UN uses a probabilistic method built on the standard deterministic cohort-component method of population forecasting (Preston, Heuveline, and Guillot 2001). This method projects forward the three components of population change: fertility, mortality, and migration, broken down by age and sex. The probabilistic method builds Bayesian hierarchical models for each of the three components and projects them forward probabilistically using a Markov chain Monte Carlo (MCMC) method, which produces a large number (typically 1,000-2,000) of trajectories of future numbers of births, deaths, and migration events in each country by age and sex. Each trajectory of fertility, mortality, and migration is then combined to give a trajectory of future population by age and sex in each country. These 1,000-2,000 trajectories of population numbers in turn approximate a probability distribution for any population quantity of interest (Raftery and others 2012; Raftery, Alkema, and Gerland 2014; Gerland and others 2014).

Fertility is projected by focusing on each country’s Total Fertility Rate (TFR), which is the expected number of children a woman would have in a given period if she survived the reproductive period (typically to age 50) and at each age experienced the age-specific fertility rates of that period. The UN models the evolution of fertility in all countries using a Bayesian hierarchical model that divides it into three phases depending on where it lies in the fertility transition from high to low fertility (pre-transition, transition, post-transition). It then fits a time series model to each phase, accounting for spatial correlation between countries (Alkema and others 2011; Raftery, Alkema, and Gerland 2014; Fosdick and Raftery 2014; United Nations 2019; Liu and Raftery 2020).12 Mortality is similarly projected by focusing on life expectancy at birth.13 This is projected by another Bayesian hierarchical model for all countries for both sexes (Raftery and others 2013; Raftery, Lalic, and Gerland 2014). The UN has traditionally projected net international migration for each country deterministically, by assuming that it would continue in the future at the same rate as currently (United Nations 2019).

We extended this UN method, designed for projections to 2100, out to 2300, and preliminary results were reviewed by the panel of nine expert demographers that we convened. While broadly supportive, the panelists agreed that the resulting uncertainty bounds for TFR in 2300 were too narrow, and in particular that the lower bound of the 95% prediction interval for world TFR in 2300 (1.66) was too high. A lower bound of 1.2 children per woman was suggested as more plausible. We incorporated this recommendation by adding a worldwide random walk component to the TFR model. Experts on the panel also suggested that international migration should be projected probabilistically, in line with the general approach, rather than deterministically as done by the UN. We implemented this by projecting net international migration using a Bayesian hierarchical model (Azose and Raftery 2015; Azose, Ševčíková, and Raftery 2016). We additionally implemented the final panel recommendation, to impose constraints on population density to prevent unrealistically high or low population numbers in some age groups in some countries.

12 The TFR has evolved in a similar way in all countries. Typically, in pre-industrial times, the TFR for a typical country was high (in the range of 4-8 children per woman). Then, usually after the onset of industrialization, it started to decrease. After a bumpy decline lasting several decades to a century, the TFR flattened out at a level below the replacement rate of about 2.1 children per woman. This decline is called the fertility transition. After the end of the fertility transition, the TFR has fluctuated without a clear trend, mostly staying below the replacement rate. For example, in the US, the TFR was around 7 children per woman in 1800 and then declined, reaching 1.74 in 1976, and thereafter fluctuating up and down; it is now 1.64, close to its 1976 level (Raftery 2021).
13 The general trend since 1840 has been that life expectancy has increased steadily (Oeppen and Vaupel 2002), with slower increases for countries with the lowest and highest life expectancy, and the fastest increases for countries in the middle.

Results. The resulting population projections to 2300, for the world as a whole and for the continents, are shown in Figure 3. They show that total world population is likely to continue to increase for the rest of the 21st century, albeit at a decreasing rate, to level off in the 22nd century, and to decline slightly in the 23rd century. Uncertainty for 2300 is considerable, appropriately reflecting the very long forecast horizon, with a median forecast of 7.5 billion but a 95% interval from 2.3 to 25.8 billion. The results agree closely with the UN forecasts for the period to 2100 (United Nations 2019).

Figure 3 also shows the results for each major continental region. The populations of Asia, Europe, and Latin America are likely to peak well before the end of this century and then decline substantially. The populations of Africa and Northern America are also likely to peak and then decline, but much later, in the 22nd century. In the case of Africa, this is due to population momentum (with a high fraction of the population currently of reproductive age) and current high fertility. In the case of Northern America, it is due to a combination of modest population momentum, fertility that is closer to replacement level than in other continents, and immigration.
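How a bundle of simulated trajectories approximates a predictive distribution, including the effect of a worldwide random-walk term in the TFR model, can be sketched as follows. The dynamics and parameters are crude illustrative stand-ins, not the UN's Bayesian hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_traj, n_periods = 2000, 56            # 5-year steps, roughly 2020-2300
tfr = np.full(n_traj, 2.4)              # starting world TFR (illustrative)
paths = np.empty((n_traj, n_periods))
for t in range(n_periods):
    # Pull toward a long-run post-transition level, plus a worldwide
    # random-walk shock that widens the uncertainty bounds at long horizons.
    tfr = tfr + 0.05 * (1.9 - tfr) + rng.normal(0.0, 0.05, n_traj)
    paths[:, t] = tfr

# Each column approximates the predictive distribution for that period;
# quantiles across trajectories give prediction intervals of the kind
# shown in Figure 3.
lo, med, hi = np.percentile(paths[:, -1], [2.5, 50.0, 97.5])
```

The same bookkeeping applies to the full model: each stored trajectory is one internally consistent future, and any quantity of interest (regional totals, age structure) is summarized by quantiles across trajectories.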
Uncertainty for each region in 2300 is high. In comparison with the population projections from the SSPs, ours are centered around a peak of slightly over 10 billion people globally, reached late this century, lying closest to SSP2, although SSP2 levels off at a higher level than our median projection after 2200. Through 2300, the 90% prediction interval around our median is narrower than the range indicated by the SSPs, and considerably narrower through 2200. SSP1 and SSP5 lie below the 5th percentile of our distribution through almost the entire time horizon to 2300. SSP3 features a very aggressive population projection in the top tail of the distribution, at about the 99th percentile in 2300. In sum, none of the SSPs has a central tendency for population in line with our fully probabilistic projections, and the range of population given by SSP1-SSP5 is wide relative to ours.

We are aware of only three other detailed efforts to project world population to 2300, all of them deterministic, in contrast with the probabilistic method described here. One was carried out by the United Nations (2004) and contained several scenarios. The range of these projections for 2300, across the different scenarios, went from 2.3 to 36.4 billion, compared with our 98% prediction interval of 1.7 to 33.9 billion. Although they use different methodologies and were carried out over 15 years apart, the two sets of projections give results that are compatible with one another, perhaps to a surprising extent.14 Another such exercise was carried out by Vallin and Caselli (1997), also deterministic, with three scenarios corresponding to different long-term trajectories of world TFR. Two of the scenarios led to world population stabilizing at around 9 billion, while the other resulted in 4.3 billion people in 2300.
All three of these scenarios give world population in 2300 well within our 80% interval, though with a range that is much narrower than either ours or that of the United Nations (2004). Gietel-Basten and others (2013) also performed a projection exercise to 2300, with a very wide range of scenarios for long-term world TFR. They obtained projections of global population yielding anything from zero to 86 billion in 2300.15

Figure 3. Probabilistic Population Projections for World and Major Regions, to 2300

Notes. Data prior to 2020 are from the UN’s World Population Prospects 2019 (UN 2019). The predictive medians are shown as solid curves; the shaded areas show the 90% and 98% predictive intervals. The world population projections from the extended SSPs are shown for comparison.

Probabilistic Economic Growth Projections to 2300 and Economic Growth Survey

14 The very high upper bound for the UN (2004) projections is likely an artifact of the perfect correlation implied by the deterministic scenarios and the aggregation of such results.

15 As in the UN (2004) projections, these very extreme outcomes are likely in part due to the perfect correlation between countries implied by the deterministic scenarios and the aggregation of such results.

Methods. The probabilistic projections of economic growth often used in analyses by governments and the private sector have not been extended to the time scale of centuries, as is needed to support SCC estimates and other economic analyses of climate change. MSW (2019) took a significant step forward by providing probabilistic econometric projections over long periods.
The MSW methodology involves a multifactor Bayesian dynamic model in which each country’s GDP per capita is based on a global frontier of developed economies (countries in the Organisation for Economic Co-operation and Development, OECD) and country-specific deviations from that frontier. Correlations between countries are also captured in a hierarchical structure that models countries in “covariance clubs,” in which country-level deviations from the frontier vary together. The hierarchical structure also permits pooling information across countries, an approach that tightens prediction intervals. This model is then estimated on data for 113 countries over 118 years (1900 to 2017). The model yields 2,000 sets of trajectories of country-level GDP per capita from 2018 to 2300. Each can be considered an equally likely uncertain future. Each is characterized by a path for the global factor and 113 country-specific deviations from that pathway. The results are described more fully below; for more information about the model, see MSW (2019). As noted earlier, however, NASEM (2017) recommended augmenting statistical models with formal expert elicitation to quantify uncertainty, especially for long-term projections. But surveying experts on long-term uncertainty of economic growth at the country level is impractical because of time constraints and the difficulty of accounting for intercountry correlations. Consequently, our study was designed to work in tandem with an econometric model that provides country-level projections and represents the intercountry dynamics. Our Economic Growth Survey (EGS) focused on quantifying uncertainty for a representative frontier of economic growth in the OECD countries. The results informed econometric projections based on the MSW model of an evolving frontier (also based on the OECD), in turn providing country-level, long-run probabilistic projections.
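The frontier-plus-deviations structure described above can be sketched in a stripped-down simulation. This is purely illustrative: the parameter values, the five-country panel, and the simple AR(1) deviation process are our assumptions, not the hierarchical Bayesian model that MSW actually estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_countries = 283, 5   # 2018-2300 inclusive; 5 hypothetical countries

# Hypothetical parameters (the actual model is estimated on 113 countries
# over 1900-2017 with a richer hierarchical structure).
mu_f, sigma_f = 0.015, 0.012   # frontier drift and shock s.d. (log scale)
rho, sigma_c = 0.97, 0.02      # persistence and shock s.d. of country deviations

# Global frontier: a random walk with drift in log GDP per capita.
log_frontier = np.cumsum(mu_f + sigma_f * rng.standard_normal(T))

# Country-specific deviations from the frontier, mean-reverting over time.
dev = np.zeros((T, n_countries))
for t in range(1, T):
    dev[t] = rho * dev[t - 1] + sigma_c * rng.standard_normal(n_countries)

# Each country's path is the shared frontier plus its own deviation.
log_gdppc = log_frontier[:, None] + dev
```

Repeating this simulation many times yields an ensemble of equally likely futures, analogous to the 2,000 MSW trajectory sets described above.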
The methodology we applied is the “classical model” (Cooke 1991, 2013) of structured expert judgment, analogous to classical hypothesis testing. In essence, the experts are treated as statistical hypotheses: they are scored on their ability to assess uncertainty based on their responses to calibration questions whose true values are known to us but unknown to the experts. This scoring allows us to weight the experts’ judgments, and the scores of combinations of experts serve to gauge and validate the combination that is adopted. The ability to performance-weight experts’ combined judgments has generally been shown to provide the advantages of narrower overall uncertainty distributions with greater statistical accuracy and improved performance both in and out of sample (Colson and Cooke 2017, 2018; Cooke and others 2021). Ten experts, selected for their expertise in macroeconomics and economic growth and recommended by their peers, were elicited individually by videoconference in roughly two-hour interviews in 2019–2020. They received an honorarium where appropriate. The full elicitation protocol is available in the online appendix; the general process was as follows. First, experts quantified their uncertainty for several initial questions, after which answers were provided for self-assessment; this step was intended to familiarize them with the process and alert them to potential biases. The experts then provided a median and 90 percent confidence range for 11 calibration questions for which the true values were known to us. Experts next provided their 1st, 5th, 50th, 95th, and 99th quantiles for the variables of interest: levels of OECD GDP per capita for 2050, 2100, 2200, and 2300. For experts more comfortable working in growth rates (rather than levels), we provided a spreadsheet tool that translated average growth rates into GDP per capita levels.
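The calibration scoring at the heart of the classical model can be sketched as follows. This is a minimal version of Cooke's calibration score; the three-quantile (5th/50th/95th) format and the example numbers are our illustrative assumptions, not the actual survey data.

```python
import numpy as np
from scipy.stats import chi2

# Theoretical mass in each interquantile bin for a perfectly calibrated
# expert reporting 5th/50th/95th percentiles.
P = np.array([0.05, 0.45, 0.45, 0.05])

def calibration_score(quantiles, realizations):
    """p-value of the hypothesis that realizations fall in the expert's
    interquantile intervals at the stated rates (higher = better calibrated)."""
    bins = [np.searchsorted(q, x) for q, x in zip(quantiles, realizations)]
    counts = np.bincount(bins, minlength=len(P))
    s = counts / counts.sum()                     # empirical bin masses
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.sum(np.where(s > 0, s * np.log(s / P), 0.0))
    stat = 2 * counts.sum() * kl                  # 2N * KL(s || P)
    return chi2.sf(stat, df=len(P) - 1)           # asymptotic chi-square test

# A well-calibrated expert: of 20 realizations, 1 falls below the 5th
# percentile, 9 in each middle interval, and 1 above the 95th percentile.
qs = [np.array([-1.64, 0.0, 1.64])] * 20
xs = [-2.0] * 1 + [-1.0] * 9 + [1.0] * 9 + [2.0] * 1
good = calibration_score(qs, xs)

# A badly miscalibrated expert: every realization above the 95th percentile.
bad = calibration_score(qs, [2.0] * 20)
```

Performance weighting then combines experts in proportion to scores of this kind (times an informativeness term in the full classical model).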
Experts were informed that their elicited quantiles of GDP levels would be combined with country-level econometric projections, as described below, but they were not shown the results. Experts were given historical data on economic growth to provide a consistent baseline of information across the panel, and they were permitted to consult outside sources if desired. Experts provided additional rationale for their quantiles verbally throughout the elicitation and concluded the survey by formally identifying the primary factors driving their low and high future growth scenarios. Given that the projections were being used as an input to the estimation of climate change damages, which would reduce economic activity below the projected level, experts were specifically asked to provide quantiles of economic growth absent the effects of further climate change as well as absent further policy efforts to reduce emissions. Two of the 10 experts provided a pair of modified base quantiles to reflect the absence of effects from climate damages and climate policy, but in general the proposed modifications to their original distributions were minor. Moreover, several experts noted that although climate change was a primary driver of their low growth projections, the complexity of the uncertainties represented in their base quantiles precluded their systematic removal, and they deemed their base quantiles appropriate for assessing uncertainty in analyses intended to include the effects of climate change. The results of the expert elicitations were combined by first fitting each expert’s five quantiles for each year, in log GDP per capita, with a Johnson SU distribution (Johnson 1949) to generate a continuous cumulative distribution function (CDF) specific to each expert.
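The quantile-fitting step just described can be sketched as follows. For illustration, the "elicited" quantiles here are synthetic, generated from a known Johnson SU distribution so that an exact fit exists; they are not values from the actual survey.

```python
import numpy as np
from scipy import optimize, stats

probs = np.array([0.01, 0.05, 0.50, 0.95, 0.99])   # elicited quantile levels

# Synthetic expert quantiles of log GDP per capita, drawn from a known
# Johnson SU distribution (parameters are arbitrary illustrative choices).
true = dict(a=-0.5, b=1.2, loc=11.0, scale=0.4)
target = stats.johnsonsu.ppf(probs, true["a"], true["b"],
                             loc=true["loc"], scale=true["scale"])

def resid(theta):
    # log-transform keeps the shape parameter b and the scale positive
    a, log_b, loc, log_scale = theta
    return stats.johnsonsu.ppf(probs, a, np.exp(log_b),
                               loc=loc, scale=np.exp(log_scale)) - target

# Least-squares fit of the four Johnson SU parameters to the five quantiles.
fit = optimize.least_squares(resid, x0=[0.0, 0.0, target[2], 0.0])
a, b = fit.x[0], np.exp(fit.x[1])
loc, scale = fit.x[2], np.exp(fit.x[3])

# The fitted distribution yields a continuous CDF for this expert; an
# equal-weight combination simply averages such CDFs across experts.
cdf = lambda x: stats.johnsonsu.cdf(x, a, b, loc=loc, scale=scale)
```

With one fitted CDF per expert and year, the equal-weight combination averages the CDFs, while the performance-weighted combination averages them with calibration-based weights.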
We next combined the CDFs in two ways: averaging across the set of expert CDFs with equal weight, and performance-weighting the experts according to their performance on the calibration questions. This process yielded a pair of final combined elicited values of OECD GDP per capita for each elicited year and quantile.16 Results of Economic Growth Survey. On the calibration questions (see online appendix), the experts demonstrated an overall high level of statistical accuracy compared with other structured expert judgment studies, and results that are robust to the exclusion of individual experts. As shown by their individual quantiles (Figure 4) and as expressed in comments during the videoconferences, most participants’ median forecast was that long-term growth would be lower than the growth rate of the past 100 years. The responses show considerable diversity in their characterization of uncertainty around the median, however, with some of the widest ranges being driven by their explicit inclusion of events that are not present or fully realized in the historical record of economic growth on which statistical growth projections are based.17 When asked to identify the primary drivers of the low-growth quantiles, the experts most commonly responded climate change, followed by world conflict, natural catastrophes, and global health crises. Rapid advancement of technology was cited most often as the primary driver of high growth, followed by regional cooperation and advances in medical science. Many experts expected that technology breakthroughs in clean energy would dramatically lower global emissions. Implicit in this narrative is a negative correlation between economic growth and carbon dioxide emissions.

16 See online appendix for further detail.

As shown in Figure 4, both the performance-weighted and the equal-weighted combinations of the experts’ distributions yield narrower ranges as well as lower medians than
do the statistical trajectories for all four years (2050, 2100, 2200, and 2300). The median of the equal-weighted combination is consistently higher than that based on performance weighting, but the difference shrinks throughout the period until the medians nearly converge in 2300. Overall, the experts viewed sustained long-term growth rates above 4 percent or even slightly below 0 percent as highly unlikely but not impossible.

17 The quantiles from one expert included global civilization-ending events that were outside the scope of the survey and incompatible with assumptions for US federal policy analysis; they unreasonably distorted the combined distributions toward extreme values. Quantiles from this expert were excluded in the final survey.

Results of econometric growth projections augmented with expert judgment. We used the EGS results to modify econometric projections of GDP per capita based on the MSW (2019) methodology and generate density functions of internally consistent projections of economic growth at the country level. As indicated in MSW (2019), economic growth 100 to 300 years into the future is highly uncertain, well beyond the range captured in typical scenario projections (see Figure 5 below).

Figure 4. Distributions of Future Average GDP Growth Rates for OECD from Experts and Econometric (MSW) Sources

Note. For each bar, the circle shows the median and the lines show the 1st, 5th, 95th, and 99th percentiles of the relevant distribution.

The tails of the MSW distribution are quite wide, leading to some implausibly low or implausibly high long-term average growth rates in the extreme tails (e.g., below the 1st percentile or above the 99th percentile).
These extreme tails correspond to extremes of persistent economic growth beyond what has been observed historically over long periods (e.g., below –1 percent or above +5 percent annually on average through 2300). Specifically, according to the Maddison Project dataset (one of two datasets used by MSW),18 which includes country-level GDP per capita data as far back as 1500 for some countries, no country has experienced such extreme growth for such long periods.19 In the MSW model, those extreme tail simulated outcomes are driven by the structure of the Bayesian model with its embedded distributional assumptions, rather than by the historical data used to estimate the model. Further, the 1st and 99th percentiles of the combined distribution of long-run growth rates based on the EGS are –0.6 percent and +4.4 percent, indicating that long-run growth rates are unlikely to fall outside this range. For these reasons, and in consultation with James Stock (an MSW coauthor), we omit some projections in the extreme tails of the MSW distribution that are outside the range of historical experience and also outside the long-run range implied by the EGS (see online appendix for our approach). Hereafter, we refer to this censored MSW version as the “MSW projections,” while noting that it differs slightly in the extreme tails. The EGS provides quantiles of economic growth for the OECD for four discrete years. To maintain the rich country-level information of the econometric model while incorporating the information from the experts, we reweight the probability of occurrence of each of the 2,000 MSW draws to satisfy the experts’ combined distribution over the long run. The underlying projections from MSW remain unchanged, but the likelihood of drawing a given trajectory is modified such that the quantiles of OECD growth reflect the distribution produced by the EGS. We accomplish this reweighting in two steps. 
First, we generate a set of target quantiles for the years 2030, 2050, 2100, 2200, and 2300 by calculating weighted averages of the combined expert quantiles and the raw MSW quantiles. The NASEM report recommended giving expert judgment increasing weight at longer horizons, so near-term weighting is governed more by historical evidence and long-term weighting more by the experts. For this reason, we increase the weight on the EGS quantiles relative to the MSW quantiles linearly over time, from 0 percent in 2030 to 100 percent in 2200 and thereafter. We then use iterative proportional fitting (Csiszár 1975) to impose the target quantiles for OECD growth on the 2,000 trajectories from MSW for each of the four benchmark years elicited in the EGS (2050, 2100, 2200, and 2300). For each range of values between elicited quantiles, this algorithm reassigns probabilities to the trajectories whose values fall within that range, minimizing a penalty for unequal weights subject to matching the target quantiles. Because there are four years for which we have a combined expert distribution to satisfy, the algorithm iterates across the years until all years’ distributions are satisfied. Figure 5 compares the resulting distributions from MSW with those reweighted according to the EGS.

18 Available at https://clio-infra.eu/Indicators/GDPperCapita.html.

19 For example, no country in the Maddison Project data has observed 100-year growth rates below –1 percent or above +3 percent.

Figure 5. Average Projected Growth Rates of GDP per Capita in OECD Countries

Notes. Adapted from MSW (2019) and an EGS performance-weighted average of those data. Shaded areas and dashed/dotted lines represent 5th to 95th (darker, dashed) and 1st to 99th (lighter, dotted) percentile ranges.
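The reweighting can be sketched with a toy version of the procedure. Everything here is an illustrative assumption: two benchmark years instead of four, synthetic trajectory values in place of the MSW draws, and hypothetical 5th/50th/95th target quantiles in place of the blended EGS targets.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                        # number of MSW-style trajectories

# Synthetic average growth rates (percent per year) at two benchmark years.
vals = {2100: rng.normal(1.5, 1.0, n), 2300: rng.normal(1.2, 1.5, n)}

# Hypothetical target 5th/50th/95th quantiles from a combined expert CDF,
# implying masses of 5/45/45/5 percent between the cutpoints.
targets = {2100: np.array([0.0, 1.5, 3.0]), 2300: np.array([-0.5, 1.0, 3.5])}
mass = np.array([0.05, 0.45, 0.45, 0.05])

w = np.full(n, 1.0 / n)                         # start from equal weights
for _ in range(100):                            # iterative proportional fitting
    for yr, cuts in targets.items():
        bins = np.digitize(vals[yr], cuts)      # interquantile bin per trajectory
        for b in range(len(mass)):
            tot = w[bins == b].sum()
            if tot > 0:
                w[bins == b] *= mass[b] / tot   # rake this year's bin masses
```

Each sweep rescales the weights so one year's bin masses match the targets exactly; iterating across years converges to weights that satisfy all years at once while staying as close to uniform as possible, which is the sense in which the trajectories themselves are left unchanged.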
Figure 6. Average Projected Growth Rates of Global GDP per Capita

Note. The solid line represents the median value, and dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges of the RFF-SPs.

We next generate a distribution of projected global GDP per capita by taking 10,000 independent samples from the population and EGS projections, taking the product of population and GDP per capita at the country level, summing to yield global GDP, and dividing by the global population for that draw. Figure 6 shows that the resulting median global GDP growth rates from the RFF-SPs track slightly higher than SSP3, with SSP1, SSP2, and SSP5 also falling within the 90 percent range. The SSPs do not span the full range of potential growth paths, especially below the median of the RFF-SP growth trajectories. As will be discussed in section V, these relatively low-growth potential paths contribute substantially to the SCC.

Projected Emissions to 2300 Based on Economic Growth: Future Emissions Survey

Methods. To generate very long-run distributions of global emissions of CO2, CH4, and N2O, our Future Emissions Survey (FES) elicited 10 experts in socioeconomic projections and climate policy who were nominated by their peers and/or by members of the RFF Scientific Advisory Board. The experts surveyed were based at universities, nonprofit research institutions, and multilateral international organizations. They have expertise in, and have undertaken, long-term projections of the energy-economic system under a substantial range of climate change mitigation scenarios. Like our Economic Growth Survey, the FES employed the classical model of structured expert judgment: experts first quantified their uncertainty about variables whose true values were known, for calibration and performance weighting.
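The aggregation from country draws to global GDP per capita described above amounts to a population-weighted average. A minimal sketch, with three hypothetical countries and synthetic lognormal draws standing in for the full country set:

```python
import numpy as np

rng = np.random.default_rng(2)
n_draws, n_countries = 10_000, 3                 # 3 hypothetical countries

# Synthetic joint draws of population (persons) and GDP per capita ($/person).
pop = rng.lognormal(mean=[16.0, 17.0, 18.0], sigma=0.2,
                    size=(n_draws, n_countries))
gdppc = rng.lognormal(mean=[9.0, 10.0, 11.0], sigma=0.3,
                      size=(n_draws, n_countries))

global_gdp = (pop * gdppc).sum(axis=1)           # sum of country GDPs, per draw
global_gdppc = global_gdp / pop.sum(axis=1)      # global GDP per capita, per draw
```

Because each draw's global figure is a weighted average, it always lies between the lowest and highest country-level GDP per capita in that draw.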
Experts next provided quantiles of uncertainty (minimum, 5th, 50th, 95th, maximum, as well as additional percentiles at the expert’s discretion) for four variables under a case we called Evolving Policies, which incorporates views about changes in technology, fuel use, and other conditions, consistent with the expert’s views on the evolution of future policy. The Evolving Policies case corresponds to the US government’s approach to benefit-cost analysis, which evaluates US regulations as incremental against a more expansive backdrop of other policies and conditions, and is responsive to NASEM recommendations to include future background policy in the uncertain distributions of socioeconomic projections. Experts provided quantiles of uncertainty for (1) fossil fuel and process-related CO2 emissions; (2) changes in natural CO2 stocks and negative-emissions technologies; (3) CH4; and (4) N2O, for five benchmark years: 2050, 2100, 2150, 2200, and 2300. For category 1, they were also asked to indicate the sensitivity of emissions to five GDP per capita trajectories.20 For each expert we generate a set of cumulative distribution functions (CDFs), one for each benchmark year, emissions source, and economic growth trajectory, by piecewise linear interpolation between the quantiles provided. Then, as in the EGS, we generate a corresponding set of combined equal-weight CDFs by averaging the CDFs in equal measure, and a set of performance-weighted CDFs by averaging in accordance with the experts’ relative performance on the calibration questions. Quantile values from the combined CDFs were linearly interpolated in time between the benchmark years to yield a distribution of piecewise linear, nonoverlapping trajectories for each emissions source and sink. Based on the FES, we developed a distribution of emissions scenarios to pair one-to-one with our economic growth scenarios.
First, we sampled one of the 10,000 economic growth trajectories described above. Second, we sampled a value (q) on the continuous interval [0,1] to determine the percentile of the experts’ emissions distribution to evaluate. Third, at five-year intervals from 2025 to 2300, we generated an interpolated value of the qth percentile of emissions based on the realized GDP level corresponding to that GDP trajectory in that year, and the qth percentile of the experts’ emissions distributions for the bounding GDP values elicited. Net emissions of CO2 were generated by sampling independent q values for direct emissions (category 1) and natural carbon stocks and negative-emissions technologies (category 2) and summing the resulting trajectories, thereby including the possibility of net negative emissions.21

20 See online appendix for a more detailed discussion of the survey methodology and the full elicitation protocol.

21 The experts received real-time feedback about the implications of their prescribed distributions for future outcomes. After each had provided a full set of quantiles, we followed the same sampling process described above to generate distributions of emissions trajectories, except that the emissions distributions were based on input provided by only that expert rather than the full set of experts, and that for expediency we presented results based on 100 to 1,000 samples at the discretion of the expert. Experts were shown their full distributions of emissions trajectories, the economic growth paths sampled, population, emissions intensity, and the resulting climate outcomes from the FaIR 2.0 climate model (described in section III) for their verification. They were permitted to modify their quantiles after seeing their distributions and resulting climate outputs, but in general they found the results to be in agreement with the intent of their quantiles and consistent with their supporting rationale.
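The three-step sampling can be sketched with a toy version of the interpolation. The elicited quantile table, the use of two bounding GDP trajectories instead of five, and all numbers are our illustrative assumptions.

```python
import numpy as np

probs = np.array([0.0, 0.05, 0.50, 0.95, 1.0])    # elicited quantile levels
years = np.array([2050, 2100, 2150, 2200, 2300])  # FES benchmark years

# Hypothetical elicited CO2 quantiles (Gt/yr), one row per benchmark year,
# at a low and a high bounding GDP trajectory (the survey used five).
elicited = {
    "low":  np.array([[-5.0,   5.0, 20.0, 35.0, 50.0],
                      [-10.0,  0.0, 12.0, 30.0, 45.0],
                      [-15.0, -5.0,  6.0, 25.0, 40.0],
                      [-15.0, -5.0,  3.0, 22.0, 38.0],
                      [-15.0, -8.0,  1.0, 20.0, 35.0]]),
    "high": np.array([[-5.0,  10.0, 30.0, 50.0, 70.0],
                      [-15.0,  0.0, 15.0, 45.0, 70.0],
                      [-20.0, -10.0, 5.0, 40.0, 65.0],
                      [-25.0, -12.0, 2.0, 35.0, 60.0],
                      [-30.0, -15.0, 0.0, 30.0, 55.0]]),
}

def emissions(year, q, gdp_share):
    """qth-quantile emissions in `year` for a GDP path lying `gdp_share` of
    the way from the low to the high bounding GDP trajectory."""
    interp = {}
    for k, table in elicited.items():
        by_year = np.array([np.interp(q, probs, row) for row in table])
        interp[k] = np.interp(year, years, by_year)   # linear in time
    return (1 - gdp_share) * interp["low"] + gdp_share * interp["high"]

# One sampled trajectory: draw q once and hold it fixed along the path.
rng = np.random.default_rng(4)
q = rng.uniform()
path = [emissions(t, q, gdp_share=0.5) for t in range(2050, 2301, 5)]
```

Holding q fixed along the path is what makes the resulting trajectories nonoverlapping, as described above; net CO2 would then sum two such paths with independent q draws for categories 1 and 2.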
Results of the Future Emissions Survey. Experts’ performance on the calibration questions was high, as measured by statistical accuracy, informativeness, and robustness of results (see online appendix). Experts described their rationale and the conditions supporting their distributions of emissions, often citing the same factors. For direct CO2 emissions (category 1), experts viewed low economic growth as likely to reduce emissions overall but also to lead to reduced global ambition in climate policy and slower progress toward decarbonization. For median economic growth conditions, experts generally viewed policy and technology evolution as the primary driver of their emissions distributions, often offering a median estimate indicating reductions from current levels but with a wide range of uncertainty. Several experts said high economic growth would increase emissions through at least 2050, most likely followed by rapid and complete decarbonization, but with a small chance of substantial continued increases in emissions. In general, the distributions were inconsistent with keeping global temperature increases below 1.5 degrees C, even when considering the potential for negative emissions.22 Though their rationales were often similar, experts’ interpretations of those narratives, as shown in their quantiles of emissions, differed substantially (Figure 7). For example, for the median growth trajectory to 2050, the median emissions ranged from 15 to 45 Gt CO2, a span encompassing a decrease of more than 50 percent to an increase of more than 30 percent from today’s levels.
Experts often provided highly skewed distributions, with significant chances that direct CO2 emissions (category 1) would be at or near zero, while allowing for much higher emissions in the middle and upper quantiles of their distributions.23

22 See online appendix for a more detailed summary of the rationales of the experts, including discussion of emissions from the other categories.

23 See online appendix for results for additional years and gases.

Figure 7. Cumulative Distribution Functions (CDFs) of Annual CO2 Emissions

Panel (A): Individual Expert and Combined CDFs for 2050

Panel (B): Combined CDFs of Expert Projections for 2050 to 2300

The experts’ narratives show an evolution of the combined distributions (Figure 8). Over time, emissions distributions for all growth trajectories shift downward, a pattern particularly evident for the median and high-growth trajectories, with median emissions approaching zero in and after 2150. Emissions distributions for the lower-growth trajectory show a narrower range of emissions overall compared with the higher-growth trajectories, but the temporal trend toward lower emissions is not as strong. Higher-growth trajectories show relatively greater probabilities of increased emissions in the near term, followed by greater chances of full decarbonization in the next century, while also allowing for the possibility of much higher emissions over the long term. (For the other categories of emissions sources and sinks, see online appendix.)

Resulting global greenhouse gas emissions projections. Figure 8 shows the resulting distribution of projected net CO2 emissions based on the FES. The median emissions trajectory shows a roughly 50 percent decrease from today’s levels by 2100, followed by slowly decreasing levels that approach but do not reach net zero.
The median of our CO2 emissions and concentrations paths is similar to SSP2, and the 98 percent confidence interval spans a range similar to that of SSP1 through SSP3, at least through 2140.24 The magnitude of CO2 emissions associated with SSP5, however, is considerably higher than the upper end (99th percentile) of our distribution through the middle of the next century, consistent with the findings of Raftery and others (2017) and Liu and Raftery (2021). Beyond the middle of the next century, all the SSP emissions trajectories increasingly lie well within our distribution because their extension beyond 2100 is constructed to achieve zero emissions by 2250. This is a weakness of the SSPs as a basis for SCC estimation, even if a subset of the SSPs spans a “reasonable range” during this century. For CH4 (Figure OA-9), the emissions and concentrations distribution resulting from the FES is centered between SSP2 and SSP5 and spans a range similar to that of SSP1–SSP5, at least through 2100. After that point, as with CO2, the emissions range spanned by the SSPs narrows, whereas the FES CH4 emissions maintain a relatively wide distribution, similar to that in 2100. For N2O (Figure OA-10), the centers of the FES emissions and concentrations paths are similar to those of SSP5, and the full distribution from the FES spans a range wider than that of all the SSPs. In sum, no single SSP is centered similarly to the FES median emissions paths across all three major greenhouse gases. The full range of emissions represented by the SSPs is higher than that of the FES for CO2 through 2140, narrows by construction to zero for CO2 after that point, and is narrower than the FES results for CH4 and N2O after 2100.
24 For comparison of emissions consistent with the SSPs beyond 2100, we adopt the commonly used extensions provided by the Reduced Complexity Model Intercomparison Project (Nicholls and others 2020).

Figure 8. Net Annual Emissions of CO2 from RFF-SPs and SSPs

Notes. Lines represent median values, and dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges of the RFF-SPs.

III. From Emissions to Monetized Climate Damages

Climate system methods. The second step in estimating the SCC is using a climate model to calculate changes in the climate system corresponding to changes in GHG emissions. Climate models vary in their representation of the underlying physics, in their spatial and temporal resolution, and in their computational requirements. Earth system models, such as those used for IPCC analyses, require supercomputers; SCC calculations, typically generated from tens to hundreds of thousands of samples to characterize their uncertainty, preclude the use of full-scale earth system models. SCC models are instead designed to emulate the response of full earth system models across a subset of relevant climate outputs, such as globally averaged surface temperature. Previous SCC calculations from the federal government used the integrated assessment models DICE, FUND, and PAGE, each of which employs its own reduced-form climate model. These IAMs can deliver substantially different temperature increases for the same pulse of emissions (Rose and others 2014), leading to inconsistency when results are averaged to calculate the SCC.
The NASEM report therefore recommended adopting a uniform climate model that met certain criteria, including that it generate a distribution of outputs across key climate metrics comparable to distributions of outputs from the full earth system models. The Finite Amplitude Impulse Response (FaIR, Millar and others 2017) model was highlighted in the NASEM report as a reduced-form model that met the criteria. To assess the changes in global mean surface temperatures resulting from the RFF-SPs, we ran the latest version, FaIR 2.0 (Leach and others 2021), using 10,000 draws from the emissions trajectories of CO2, CH4, and N2O while also sampling across FaIR’s native uncertainty in climate variables.25 Resulting temperature change from RFF-SPs. Figure 9 shows the median temperature trajectory associated with the RFF-SPs: increases reaching nearly 3 degrees C above the average global temperature for 1850–1900 (the standard IPCC preindustrial benchmark) by 2100, and continued increases through 2300. The low end of the distribution indicates a roughly 5 percent chance that the increase will remain below 2 degrees C through 2100. Our experts’ expectations for negative-emissions technologies lead to an increasing chance of drawing down atmospheric CO2 to yield temperatures at current levels and below by the late 2100s. The RFF-SP median temperature trajectory tracks closely with SSP2 through 2200, thereafter continuing to increase slightly. SSP1 is largely consistent with the 5th percentile results throughout the period. Temperatures resulting from SSP3 emissions are consistent with the 95th percentile of the RFF-SPs through the middle of the next century, at which point temperatures stop increasing, by construction. The median temperature from SSP5 is roughly consistent with the 99th percentile of temperatures from the RFF-SPs through 2100, at which point it begins to level off to meet the imposed requirement for net-zero emissions by 2250.
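Reduced-form climate emulators of this kind typically represent temperature as a sum of exponential responses to radiative forcing. A minimal two-timescale sketch (not FaIR itself; the parameter values are our illustrative assumptions, in the general range used by such emulators):

```python
import numpy as np

q = np.array([0.33, 0.41])   # K per (W/m^2): fast and slow response amplitudes
d = np.array([4.1, 249.0])   # response timescales in years

def temperature(forcing):
    """Global mean temperature anomaly (K) from an annual forcing series (W/m^2)."""
    decay = np.exp(-1.0 / d)
    T = np.zeros((len(forcing), 2))
    for t in range(1, len(forcing)):
        # each thermal box relaxes toward q_i * F(t) with its own timescale
        T[t] = T[t - 1] * decay + q * forcing[t] * (1 - decay)
    return T.sum(axis=1)

# A step forcing of about 3.7 W/m^2 (roughly a CO2 doubling) held for 2,000
# years approaches the implied equilibrium warming of (0.33 + 0.41) * 3.7 K.
T = temperature(np.full(2000, 3.7))
```

The fast box captures the decadal surface response and the slow box the multi-century ocean adjustment; a full emulator such as FaIR adds a carbon cycle linking emissions to forcing and samples these parameters to represent climate uncertainty.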
25 Trajectories for forcing agents other than CO2, CH4, and N2O were drawn from SSP2.

Figure 9. Global Mean Surface Temperature Change from RFF-SPs and SSPs

Notes. Temperature change is relative to the standard 1850–1900 preindustrial average. Solid lines represent median values. Dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges based on the RFF-SPs. For clarity of presentation, uncertainty in the climate system is reflected in the uncertainty range only for the RFF-SPs (and not the SSPs).

In this comparison, uncertainty in the climate system itself, as represented by the uncertain distributions of climate parameters in the FaIR model, contributes significant uncertainty to the range of projected temperatures. The temperature distributions for the RFF-SPs include climate uncertainty from FaIR, but for clarity we omit climate system uncertainty in presenting projected temperatures from the SSPs. For a sense of scale, the 90 percent range of temperatures from FaIR in 2300 for SSP5 is about –2.5 to +7 degrees C around the median.

Methods for climate damage estimation. The third step in estimating the SCC is translating changes in the climate system, such as temperature, into total economic damages over time. Damages can be calculated by estimating costs for various sectors (e.g., human health and mortality, agriculture, energy usage, coastal flooding) and summing them, or by taking an aggregate approach to estimate damages across the economy as a whole. Recent advances in methodologies for damage estimation are not reflected in the IAMs used by the federal government to calculate the SCC (NASEM 2017; Diaz and Moore 2017).
The NASEM report made recommendations on improving sectoral damage estimation, finding sufficient peer-reviewed research to support updates on human health and mortality, agriculture, coastal inundation, and energy demand. Since the report was issued, the literature addressing specific sectors has grown. Nevertheless, few studies meet the full requirements (e.g., global coverage with regional detail, translation into economic damages) put forward by Diaz and Moore (2017b) or Raimi (2021) to serve as the basis for an updated damage function for the SCC. For example, two independent, comprehensive reviews (Bressler 2021; Raimi 2021) found just three suitable studies (World Health Organization 2014; Gasparrini and others 2017; Carleton and others 2018). Our own further assessment of the damages literature found two candidates for agricultural damages (Moore and others 2017; Calvin and others 2020), two for energy demand (Clarke and others 2018; Ashwin and others 2021), and one for coastal damages (Diaz 2016). Among the notable additions, the Climate Impact Lab (CIL) has developed a methodology to generate empirically derived, hyper-localized damage functions accounting for adaptation. In its research, the CIL has been applying this methodology across a comprehensive set of sectors, including health, agriculture, labor, energy, conflict, coastal, and migration (Carleton and others 2018). Upon completion, this full set of sectors is intended to support fully empirically based climate damage estimates. Much of the new sectoral damages research identified here is currently under peer review for publication, and efforts to implement the existing peer-reviewed studies will similarly be completed on a timeline compatible with the IWG process to update the SCC.
As described below, for the purposes of this paper we have deployed the aggregate global climate damage function from the widely used DICE model (Nordhaus 2017b) to develop illustrative SCC estimates, coupled with the RFF-SPs, the FaIR climate model, and the stochastic discounting approach described in the next section.
IV. Discounting Approaches for the Social Cost of Greenhouse Gases
The long residence time of CO2 in the atmosphere implies that today's emissions will have consequences for centuries. This time horizon makes the discount rate a major factor for the SCC. For example, the IWG's 2021 interim SCC estimate is $51/ton with a 3 percent discount rate (IWG SCC 2021) but would be about $121/ton at a 2 percent discount rate (RFF and NYSERDA 2021). That 1 percentage point difference alone would more than double the SCC and, by implication, greatly strengthen the economic rationale for substantial emissions reductions. The discount rates used in federal regulatory analysis are guided by Circular A-4, issued by the Office of Management and Budget (OMB) in 2003, which endorses rates of 3 percent and 7 percent reflecting, respectively, consumption and investment rates of return. OMB guidance also allows for additional sensitivity analysis in cases with intergenerational consequences, such as climate change. However, this guidance runs counter to current economic thought and evidence, for three reasons: (1) a constant deterministic discount rate becomes increasingly problematic for long-horizon problems (Weitzman 1998); (2) benchmarks for the consumption rate of interest (currently 3 percent) have declined substantially over the past two decades (CEA 2017; Bauer and Rudebusch 2020, 2021); and (3) the rationale for 7 percent—to address possible policy effects on capital—is flawed in ways that are magnified for very long-term decisions (Li and Pizer 2021).
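The leverage of the discount rate at long horizons is visible in the discount factors themselves. A minimal sketch, with illustrative rates and horizons only (this is not the IWG's actual SCC calculation):

```python
import math

def discount_factor(rate, years):
    """Continuous-time discount factor e^(-rt) applied to a damage t years out."""
    return math.exp(-rate * years)

# Weight placed today on $1 of climate damages occurring far in the future,
# at a 3 percent versus a 2 percent discount rate.
for t in (50, 100, 200):
    w3 = discount_factor(0.03, t)
    w2 = discount_factor(0.02, t)
    print(f"t={t}: 3% weight {w3:.4f}, 2% weight {w2:.4f}, ratio {w2 / w3:.1f}x")
```

At a 100-year horizon the 2 percent rate puts roughly 2.7 times as much weight on damages as the 3 percent rate, and at 200 years roughly 7.4 times, which is why a 1 percentage point change can more than double an SCC dominated by distant damages.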
The NASEM (2017) report and recent technical guidance on the SCC (IWG SCC 2021) acknowledged those concerns. A 2021 executive order26 directed OMB to reassess existing practice and consider "the interests of future generations" in revisions to Circular A-4. Alongside issues related to empirical discount rate uncertainty over long time horizons, the comparison of welfare across generations creates an ethical concern dating back at least as far as Ramsey (1928): do we discount the welfare of future generations simply because they are born later?
One rationale for a modified discounting approach stems from uncertainty in the discount rate, which tends to lead to declining future discount rates. Weitzman (1998) showed that if one is uncertain about the future trajectory of (risk-free) discount rates, and shocks to the discount rate are persistent, the certainty-equivalent (risk-free) discount rate declines with the time horizon toward the lowest possible rate. This result stems from a straightforward application of Jensen's inequality to a stochastic discount factor, leading to declining (risk-free) discount rates (Arrow and others 2014). At the same time, if the payoffs to investments in emissions reductions are correlated with future income, the effective risk-adjusted rate could be higher if the correlation is positive, or lower if it is negative (Gollier 2014). This correlation is often termed the "climate beta," but it is not clear ex ante whether the beta is positive, as in Nordhaus's work and as argued by Dietz and others (2018), or negative, as in Lemoine (2021).
26 https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/20/modernizing-regulatory-review/, Sec. 2.
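Weitzman's declining-rate result is easy to verify numerically. A minimal sketch, assuming (purely for illustration) two equally likely persistent rate scenarios of 1 percent and 7 percent:

```python
import math

def certainty_equivalent_rate(t, rates=(0.01, 0.07), probs=(0.5, 0.5)):
    """Certainty-equivalent discount rate R(t) implied by averaging discount
    factors (not rates) across persistent rate scenarios, as in Weitzman (1998)."""
    expected_factor = sum(p * math.exp(-r * t) for p, r in zip(probs, rates))
    return -math.log(expected_factor) / t

# Jensen's inequality: the expected discount factor is dominated at long
# horizons by the lowest-rate scenario, so R(t) declines toward 1 percent.
for t in (1, 50, 100, 400):
    print(f"R({t}) = {certainty_equivalent_rate(t):.4f}")
```

R(1) sits near the 4 percent average of the two rates, while R(400) has fallen to about 1.2 percent, approaching the lowest possible rate.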
The second argument for changing the government's discounting approach is the systematic decline in observed interest rates over at least the past two decades (Kiley 2020; Del Negro and others 2017; Johannsen and Mertens 2016; Laubach and Williams 2016; Caballero and others 2017; J. H. E. Christensen and Rudebusch 2019; CEA 2017; Rachel and Summers 2019; Bauer and Rudebusch 2020, 2021), which along with other research (Giglio and others 2015a, 2015b; Drupp and others 2018; Carleton and Greenstone 2021) has led to calls for using a lower discount rate; 2 percent is often suggested. The third issue is the need, in light of recent research (Li and Pizer 2021), to rethink the use of the higher discount rate (7 percent) reflecting the return to capital. Several decades ago, researchers suggested that when taxes create a wedge between consumption and investment interest rates, the alternative rates could be used to bound a benefit-cost analysis, as a shorthand version of the shadow price of capital (SPC) approach (Harberger 1972; Sandmo and Drèze 1971; Marglin 1963a, 1963b; Drèze 1974; Sjaastad and Wisecarver 1977). However, the assumptions underlying the soundness of that approach are quite restrictive: costs are assumed to occur entirely in the first period; benefits are constant and occur either in a single period or in perpetuity; and benefits displace only consumption while costs displace either investment or consumption. Li and Pizer (2021) extend Bradford (1975) to show that the SPC approach yields a range of estimates centered on the results simply using a consumption interest rate. The NASEM (2017) report foreshadowed those results and recommended using a central consumption rate estimate along with high and low sensitivity cases.
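The mechanics of the shadow price of capital approach discussed above can be sketched numerically. The streams, rates, and SPC multiplier below are hypothetical placeholders chosen for illustration, not values from the paper:

```python
# Hypothetical illustration of the shadow price of capital (SPC) approach:
# costs that displace investment are scaled up by the SPC multiplier, then
# everything is discounted at a single consumption rate of interest.
def present_value(stream, rate):
    """PV of a stream of annual values, with stream[t] occurring t years from now."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

consumption_rate = 0.02   # assumed central consumption discount rate
spc = 1.2                 # assumed shadow price of capital multiplier

costs = [100] + [0] * 50    # up-front cost, assumed to displace investment
benefits = [0] + [5] * 50   # benefit stream, assumed to accrue to consumption

# Central case: all flows discounted at the consumption rate, no SPC adjustment.
central = present_value(benefits, consumption_rate) - present_value(costs, consumption_rate)

# SPC sensitivity: scale the investment-displacing costs by the SPC multiplier.
spc_case = present_value(benefits, consumption_rate) - spc * present_value(costs, consumption_rate)
```

Scaling the up-front cost by the SPC lowers net benefits relative to the central case; applying the SPC instead to capital-displacing benefits would raise them, which is what generates the bounding range around the consumption-rate result.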
Newell and others (2021) provide some guidance, suggesting a central value between 2 and 3 percent and high and low values between 1.5 percent and 5 percent (though they do not recommend those particular values). Their discussion of discount rates is based primarily on questions of the most appropriate near-term consumption rate and does not address the SPC approach. Pizer (2021) details how the SPC approach could be implemented, suggesting sensitivity cases around a central consumption discount rate, with benefits and costs alternately multiplied by a particular shadow price of capital; a value of 1.2 is proposed as conservative. This approach to discounting sensitivity analysis would therefore include (1) a central case with a central consumption discount rate value and no use of the SPC; (2) sensitivities around this central case with low and high consumption discount rates; and (3) sensitivities around this central case with the SPC applied alternately to costs and benefits using a central discount rate. The sensitivity analysis (3) would be included only when the SCC is implemented in a particular benefit-cost analysis. Each of those discounting ideas (including stochastic growth discounting, discussed below) could be incorporated in a revision to Circular A-4, with relevance to both SCC estimation and other contexts. This would harmonize SCC discounting with broader US government guidance on benefit-cost analysis.
Stochastic Growth Discounting with Economic Uncertainty
One rationale for discounting, generally, is the concept of declining marginal utility of consumption. Intuitively, a $100 cost in a future in which society has grown dramatically wealthier should be valued less, from today's perspective, than the same $100 cost in a relatively poor future with stagnant economic growth.
This result is often embodied by the classic equation derived in Ramsey (1928) that relates the consumption discount rate (