ARFIMA Reference Forecasts for Worldwide CO2 Emissions and the Need for Large and Frontloaded Decarbonization Policies

We provide reference forecasts for worldwide CO2 emissions from fossil fuel combustion and cement production based on an ARFIMA approach. Our projections suggest that for the IPCC goals to be achieved it is necessary to reduce emissions by 57.4% and 97.4% of 2010 emissions by 2030 and 2050, respectively. Furthermore, the bulk of these efforts has to take place by 2030. Finally, the presence of long memory in the data suggests that policies must be persistent to ensure permanent reductions in emissions. These results add to the sense of urgency in dealing with the issue of decarbonization.

not stationary, then even transitory policies will have permanent effects on emissions, and a steady policy stance is less critical. Long memory is an intermediate case in which the policy stance needs to be steady but the effects of transitory policies are long lasting.
The remainder of this paper is organized as follows. Section 2 presents and describes the data set.
Section 3 provides a brief technical description of the methodology used. Section 4 discusses the empirical findings, considering first the fractional integration analysis and then the accuracy of in-sample forecasts. Section 5 presents and discusses our reference forecasts vis-à-vis the new IPCC targets. Finally, Section 6 provides a summary of the results and discusses their policy implications.

Data Sources
In this paper, we use annual data for global CO2 emissions for the period between 1950 and 2017. The data until 2014 are from the Carbon Dioxide Information Analysis Center (see Boden et al., 2017). This data set contains information going back to 1870. Nevertheless, in this work we have elected to work only with data starting in 1950, given the profound structural changes that occurred after World War II.
In turn, the data for 2015-2017 are from the national emissions inventories collected by the United Nations (see UNFCCC, 2018) and reported by the Global Carbon Atlas (2019).
Aggregate CO2 emissions are the sum of five components: emissions from burning solid, liquid, and gaseous fossil fuels, emissions from gas flaring, and emissions from cement production. The data do not consider emissions from land use, nor from land-use change and forestry. All variables are measured in million metric tonnes of carbon per year (Mt, hereafter), and were converted into units of carbon dioxide by multiplying the original data by 3.664, the ratio of the molecular weight of carbon dioxide to the atomic weight of carbon. Table 1 presents summary information about our data. The first column shows the value of global CO2 emissions in the first year of each decade, while the remaining columns show the mean shares in total emissions for the decade of each of the emissions sources.
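The carbon-to-CO2 conversion described above is a single multiplicative rescaling. A minimal sketch (the helper name `carbon_to_co2` is ours, not the paper's):

```python
# Convert emissions measured in million metric tonnes of carbon (Mt C)
# into million metric tonnes of CO2, using the factor cited in the text:
# 3.664, the ratio of the molecular weight of CO2 (about 44.01) to the
# atomic weight of carbon (about 12.011).
C_TO_CO2 = 3.664

def carbon_to_co2(mt_carbon: float) -> float:
    """Convert Mt of carbon per year into Mt of CO2 per year."""
    return mt_carbon * C_TO_CO2

print(carbon_to_co2(1000.0))  # 1000 Mt C is roughly 3664 Mt CO2
```

The factor itself can be verified from the atomic weights: 44.01 / 12.011 rounds to 3.664.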

Description of the Data
Over the sample period, worldwide CO2 emissions grew consistently. Nevertheless, we can identify two distinct periods. Between 1950 and 1980, global

Fractionally-Integrated Processes
A fractionally-integrated process is a stochastic process whose degree of integration is a fractional number and whose autocorrelations decay slowly, at a hyperbolic rate. Accordingly, fractionally-integrated processes display long-run rather than short-term dependence and, for that reason, are also known as long-memory processes. A time series $y_t$ is said to be fractionally integrated of order $d$, or $I(d)$, if it can be represented as

$(1 - L)^d (y_t - \beta' z_t) = u_t$, $t = 1, 2, \dots, T$, (1)

where $\beta$ is the coefficients vector, $z_t$ represents all deterministic factors of the process, $L$ is the lag operator, $d$ is a real number that captures the long-run effect, and $u_t$ is $I(0)$. Allowing for values of $d$ in the interval between 0 and 1 gives an extra flexibility that may be important when modeling long-term dependence in the conditional mean. Indeed, in contrast to an $I(0)$ time series ($d = 0$), in which shocks die out at an exponential rate, or an $I(1)$ process ($d = 1$), in which there is no mean reversion, shocks to the conditional mean of an $I(d)$ time series with $0 < d < 1$ dissipate at a slow hyperbolic rate. More specifically, if $-0.5 < d < 0$, the autocorrelation function decays at a hyperbolic rate and the process is called anti-persistent or, alternatively, is said to display rebounding behavior or negative correlation. If $0 < d < 0.5$, the process reverts to its mean but the autocovariance function decreases slowly as a result of the strong dependence on past values; the effects of shocks therefore last longer than in the pure stationary case ($d = 0$). If $0.5 < d < 1$, the process is non-stationary with a time-dependent variance, but the series retains its mean-reverting property. Finally, if $d \geq 1$, the process is non-stationary and non-mean-reverting, i.e., the effects of random shocks are permanent (for details see, for example, Granger & Joyeux, 1980; Granger, 1980, 1981; Sowell, 1992a, 1992b; Baillie, 1996; Palma, 2007; Hassler et al., 2016; Belbute & Pereira, 2015).
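The contrast between hyperbolic and exponential shock decay can be made concrete with the MA($\infty$) weights of $(1 - L)^{-d}$, which satisfy the recursion $\psi_0 = 1$, $\psi_k = \psi_{k-1}(k - 1 + d)/k$. A minimal numpy sketch (the helper name `frac_ma_weights` is ours):

```python
import numpy as np

def frac_ma_weights(d: float, n: int) -> np.ndarray:
    """Impulse-response (MA-infinity) weights of (1 - L)^(-d):
    psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k."""
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return psi

# A shock to an I(d) process with d = 0.3 dies out hyperbolically (roughly
# like k^(d-1)), far more slowly than the exponential decay of a stationary
# AR(1) with coefficient 0.5.
psi = frac_ma_weights(0.3, 200)
ar1 = 0.5 ** np.arange(200)
print(psi[50], ar1[50])  # the long-memory weight dwarfs the AR(1) weight at lag 50
```

Note that for $d = 1$ the recursion gives $\psi_k = 1$ for every $k$: shocks never die out, which is exactly the permanent-effect unit-root case described above.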

ARFIMA Processes
An ARFIMA model is a generalization of the ARIMA model that frees it from the I(0)/I(1) dichotomy, thereby allowing for the estimation of the degree of integration of the data-generating process. In an ARMA process, the AR coefficients alone determine whether or not the series is stationary. In the case of the ARFIMA model, the AR($p$) and MA($q$) terms are treated as part of the model selection criteria. Accordingly, the ARFIMA approach provides a more comprehensive and yet more parsimonious parameterization of long-memory processes than ARMA models. Moreover, in the ARFIMA class of models, the short-run and long-run dynamics are disentangled: the short-run behavior is modeled through the conventional ARMA polynomial, while the long run is captured by the fractional differencing parameter $d$ (see, among others, Bollerslev & Mikkelsen, 1996).
If the process $u_t$ in (1) is an ARMA($p$, $q$), then $y_t$ is an ARFIMA($p$, $d$, $q$) process and can be written as

$\Phi(L)(1 - L)^d (y_t - \beta' z_t) = \Theta(L)\varepsilon_t$, (2)

where $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p$ and $\Theta(L) = 1 + \theta_1 L + \theta_2 L^2 + \dots + \theta_q L^q$ are polynomials of order $p$ and $q$, respectively, with all zeroes of $\Phi(L)$ and $\Theta(L)$ lying outside the unit circle, and with $\varepsilon_t$ white noise. Clearly, the process is stationary and invertible for $-0.5 < d < 0.5$.
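Since $\Phi(L)$ and $(1 - L)^d$ commute, an ARFIMA path can be generated by first applying the ARMA recursion to white noise and then fractionally integrating the result. A simulation sketch under that assumption, using a truncated MA($\infty$) expansion of $(1 - L)^{-d}$ (the helper name `simulate_arfima` is ours, and the truncation is an approximation, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_arfima(d, phi, n, burn=500):
    """Simulate an ARFIMA(1, d, 0) path, Phi(L)(1 - L)^d y_t = eps_t,
    via a truncated MA(inf) expansion of (1 - L)^(-d). Illustrative sketch."""
    eps = rng.standard_normal(n + burn)
    # AR(1) part: u_t = phi * u_{t-1} + eps_t
    u = np.empty(n + burn)
    u[0] = eps[0]
    for t in range(1, n + burn):
        u[t] = phi * u[t - 1] + eps[t]
    # fractional integration: y_t = sum_k psi_k * u_{t-k}
    psi = np.empty(n + burn)
    psi[0] = 1.0
    for k in range(1, n + burn):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    y = np.convolve(u, psi)[: n + burn]
    return y[burn:]  # drop the burn-in so the start-up truncation matters less

y = simulate_arfima(d=0.3, phi=0.4, n=1000)
```

With $d = 0.3$ the simulated series is stationary but wanders away from its mean for long stretches, which is the visual signature of long memory.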
The parameters of the ARFIMA model, $d$, $\phi$, $\theta$, and $\sigma^2_\varepsilon$, are estimated by the method of maximum likelihood. The log-Gaussian likelihood of $y$ given the parameter estimates $\hat\eta$ is

$\log L(y \mid \hat\eta) = -\frac{T}{2}\log(2\pi) - \frac{1}{2}\log|\hat V| - \frac{1}{2} y' \hat V^{-1} y$,

where $y$ represents a $T$-dimensional vector of the observations on the process and the covariance matrix $\hat V$ has a Toeplitz structure.
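The paper estimates $d$ by exact maximum likelihood. As a lighter-weight illustration of how $d$ can be recovered from data, the following sketches the Geweke-Porter-Hudak (GPH) log-periodogram regression, a standard semiparametric alternative; it is not the authors' estimator, and the function name and bandwidth default are our own choices:

```python
import numpy as np

def gph_estimate(x, bandwidth=None):
    """Geweke-Porter-Hudak log-periodogram estimate of d: regress
    log I(lambda_j) on -log(4 sin^2(lambda_j / 2)) over the first m
    Fourier frequencies; the slope estimates d."""
    x = np.asarray(x, float)
    n = len(x)
    m = bandwidth or int(np.sqrt(n))  # common rule-of-thumb bandwidth
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    # periodogram at the first m Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1 : m + 1]
    periodogram = (np.abs(dft) ** 2) / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(512)
print(gph_estimate(white))  # for an I(0) series the estimate should be near 0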

ARFIMA Forecasting and Prediction-Accuracy Assessment
Given the symmetry properties of the covariance matrix, $\hat V$ can be factored as $\hat V = C D C'$, where $D = \mathrm{Diag}(v_t)$ and $C$ is lower triangular. Moreover, let $\hat\varepsilon = C^{-1} y$, let $y_t = (y_1, y_2, \dots, y_t)'$, and let $C_t$ be the $t \times t$ upper-left sub-matrix of $C$.

Let $\hat\varepsilon_t = (\hat\varepsilon_1, \hat\varepsilon_2, \dots, \hat\varepsilon_t)'$. The best linear forecast of $y_{t+1}$ is based on $y_1, y_2, \dots, y_t$. Moreover, the best linear predictor of the innovations is $\hat\varepsilon = C^{-1} y$ (with $C$ the lower-triangular factor of $\hat V$), from which the one-step-ahead forecasts $\hat y$ follow in matrix notation. Forecasting is carried out as suggested by Beran (1994), so that

$\hat y_{T+k} = \tilde\gamma' \hat V^{-1} y$,

where $\tilde\gamma = (\hat\gamma_{T+k-1}, \hat\gamma_{T+k-2}, \dots, \hat\gamma_k)'$ collects the estimated autocovariances. The accuracy of the predictions is based on the average squared forecast error. There is a wide diversity of loss functions available, and their properties vary extensively. Even so, all of them share a common feature: "lower is better". That is, a large value indicates poor forecasting performance, whereas a value close to zero implies an almost-perfect forecast. We use three average loss indicators: the Mean Absolute Percentage Error (MAPE), the Adjusted Mean Absolute Percentage Error (AMAPE), and Theil's U inequality coefficient.
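The Beran-style predictor $\tilde\gamma' \hat V^{-1} y$ can be sketched for the pure fractional case ARFIMA(0, $d$, 0) with known $d$, where the autocovariances have the closed form $\gamma(0) = \Gamma(1-2d)/\Gamma(1-d)^2$ and $\gamma(k) = \gamma(k-1)(k-1+d)/(k-d)$ (unit innovation variance). The helper names below are ours, and this is a simplified sketch, not the paper's full ARFIMA computation:

```python
import math
import numpy as np

def fn_autocov(d, nlags):
    """Autocovariances of fractional noise ARFIMA(0, d, 0), unit variance:
    gamma(0) = Gamma(1-2d)/Gamma(1-d)^2, gamma(k) = gamma(k-1)*(k-1+d)/(k-d)."""
    g = np.empty(nlags)
    g[0] = math.gamma(1.0 - 2.0 * d) / math.gamma(1.0 - d) ** 2
    for k in range(1, nlags):
        g[k] = g[k - 1] * (k - 1 + d) / (k - d)
    return g

def best_linear_forecast(y, d, steps=1):
    """k-step-ahead best linear predictor gamma_tilde' V^{-1} y, with V the
    Toeplitz autocovariance matrix of the sample and
    gamma_tilde = (gamma(T+k-1), ..., gamma(k))', as in the text."""
    y = np.asarray(y, float)
    T = len(y)
    g = fn_autocov(d, T + steps)
    V = g[np.abs(np.subtract.outer(np.arange(T), np.arange(T)))]  # Toeplitz
    weights = np.linalg.solve(V, y)  # V^{-1} y (V is symmetric)
    return np.array([g[np.arange(T + k - 1, k - 1, -1)] @ weights
                     for k in range(1, steps + 1)])
```

Because $\gamma(k)$ decays hyperbolically, distant observations still receive non-negligible weight in the forecast, which is exactly why long memory matters for long-horizon projections.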
The MAPE and the AMAPE are relative measures, expressed as percentages. In particular, the MAPE is the average absolute percentage error and has the advantage of having a lower bound of zero.
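The three loss indicators can be sketched as follows. The paper does not spell out its exact AMAPE formula, so the version below, which scales each error by the sum of the actual and forecast values, is one common convention and an assumption on our part; the Theil U shown is the bounded inequality-coefficient form:

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def amape(actual, forecast):
    """Adjusted MAPE, in percent: errors scaled by (actual + forecast).
    One common definition; the paper's exact formula is not stated."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / (actual + forecast)))

def theil_u(actual, forecast):
    """Theil's U inequality coefficient: 0 for a perfect forecast, at most 1."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    rmse = np.sqrt(np.mean((actual - forecast) ** 2))
    return rmse / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(forecast ** 2)))
```

All three are "lower is better": a perfect forecast yields 0 for each, matching the interpretation given in the text.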

Fractional Integration Analysis
We present in Table 2 the results of the estimation of the ARFIMA($p$, $d$, $q$) models. The best specifications were selected using the Schwarz Bayesian Information Criterion (BIC) and include statistically significant autoregressive and moving-average terms. We performed preliminary tests for the existence of structural breaks for all variables following the procedures in Bai and Perron (2003). The test results show no significant evidence of break points. Still, when simple visual inspection of the data suggested the possible presence of break points, a dummy variable was included in the ARFIMA models. The corresponding estimated coefficients were never statistically significant, and the best specification for the ARFIMA models as indicated by the BIC never includes structural breaks.
Our results provide strong empirical evidence for the non-rejection of the presence of long memory in both aggregate CO2 emissions and each of its five components. The estimated values of the fractional parameter $d$ are all between 0 and 1, thus allowing us to reject both the pure stationarity case ($d = 0$) and the unit-root case ($d = 1$). Table 3 summarizes our in-sample forecasting accuracy results.

In-Sample Global CO 2 Emissions Forecasts
We get excellent in-sample predictions with a MAPE ranging from a minimum of 3.2% for aggregate CO 2 emissions to a maximum of 6.8% for emissions from gas flaring. In addition, the percentage of projected values outside the confidence interval ranges from a minimum of 4.4% for total emissions to a maximum of 7.5% for emissions from gas flaring.
In turn, the U-statistic is very low, varying in a band between 0.01 and 0.05. This suggests that the predictions compare quite well with the observed values. Furthermore, the predictions are non-skewed and show a low variance, which suggests that they closely follow the changes in the observed values.
In fact, more than 94% of the prediction error in all components under analysis is non-systematic.
Finally, the projections based on the aggregate results are very close to the sum of the projections for each of its five components. The difference is, on average, 1% for the in-sample projections discussed here and 4.5% for the out-of-sample projections presented below. This confirms the good performance of our ARFIMA approach.

The ARFIMA Forecasts 2018 -2050
Having established a good in-sample forecasting performance for the ARFIMA estimates, we use these estimates to forecast CO2 emissions until 2050. We present the results in Figure 2, while Table 4 presents summary results relative to 2020 reference levels (detailed results are available from the authors upon request).
We forecast total CO 2 emissions to be 37,171 Mt by 2050 after having reached a peak of 37,623 Mt in 2034. The forecasted levels of emissions in 2030 and 2050 are 12.4% and 11.1% above the 2010 reference level, respectively.
From a disaggregated perspective, we forecast CO 2 emissions from liquid fuels, gas fuels, and cement production to be systematically above the 2010 reference levels. In turn, we project emissions from solid fuels and from gas flaring to be below the 2010 reference levels after the first few years into the forecasting horizon.
We forecast CO 2 emissions from liquid fuels to be 14.2% and 15.2% above the 2010 emissions levels by 2030 and 2050, respectively. In turn, the projected emissions from gas fuels are 29.9% and 42.2% above the 2010 level by 2030 and 2050, respectively. The same is true for the projected emissions from cement production, which will reach levels 33.0% and 27.4% above the 2010 levels by 2030 and 2050, respectively.
Conversely, projected CO2 emissions from solid fuels follow a decreasing pattern from 2021 onwards. By 2030 and 2050, the projected emissions are 12.9% and 24.3% below the 2010 level, respectively. In turn, the projected emissions from gas flaring start declining after 2026 and reach levels 5.8% and 28.7% below the 2010 levels by 2030 and 2050, respectively.

Figure 3 compares the IPCC policy targets for CO2 emissions with our reference forecasts. The total target accumulated reduction by 2050 corresponds to a reduction of 85.1% in emissions relative to 2010 levels. Of the greatest importance is the comparison of the IPCC policy CO2 emissions targets with our ARFIMA reference-scenario emissions. Since our forecasts capture the statistical information included in the sample, they provide the most fundamental reference-case forecasts. We use them to assess the net policy effort compatible with carbon neutrality in 2050.

[Figure panels: a) Total CO2 Emissions; b) CO2 Emissions from Solid Fuels; c) CO2 Emissions from Liquid Fuels; d) CO2 Emissions from Gas Fuels; e) CO2 Emissions from Cement Production; f) CO2 Emissions from Gas Flaring]
In order to meet the IPCC target for 2030, a policy effort is necessary that cuts emissions by 57.4% relative to 2010 levels. Of this, 12.4% corresponds to the extra effort due to the inertia of the natural CO2 emissions system. In turn, for the 2050 target to be reached, a total effort of 97.4% of 2010 emissions is required. Therefore, the policy efforts for decarbonization are not just very large; they are larger than implied by the IPCC targets. They are also frontloaded, in a way that clearly exceeds the frontloading already contemplated in the IPCC targets.
The general pattern described above, in which the inertia of the emissions system increases the policy effort required to achieve the IPCC targets, also applies to emissions from liquid fuels, gas fuels, and cement production. The policy efforts required to achieve decarbonization by 2050 are 118.3% of 2010 levels for emissions from cement production, 114.9% for emissions from gas fuels, and 99.3% for emissions from liquid fuels.

Summary, Conclusions, and Policy Implications
This work uses an ARFIMA approach to evaluate the degree of persistence of worldwide CO2 emissions from solid, liquid, and gas fossil fuel combustion, as well as from gas flaring and cement production, and to make the corresponding forecasts to 2050. These forecasts, in turn, allow us to assess the policy effort required to meet the IPCC targets. Our empirical results suggest that CO2 emissions, both at the aggregate level and for each of the five components, follow fractionally-integrated processes. Accordingly, they show long memory, and the effects of shocks tend to dissipate at a slow hyperbolic rate. Emissions from liquid fuels and gas flaring exhibit the weakest degree of long-range dependence, while emissions from solid fuels and cement production have the strongest.

The long-memory nature of CO2 emissions implies that any policy shock will have temporary effects, albeit longer-lasting than suggested by a traditional stationarity analysis. The mean-reversion property, however, implies that the policy effort must be persistent in order to produce equally persistent effects. This is particularly relevant in the framework of the international strategies for achieving carbon neutrality by 2050, where it will be crucial to promote permanent changes in behavior.
According to our projections, worldwide CO2 emissions will peak in 2034 and decline slowly thereafter. Emissions in 2050 will be 11.1% above 2010 levels. This aggregate pattern is driven by the evolution of emissions from liquid fuels and gas fuels. For emissions from the combustion of solid fossil fuels, gas flaring, and cement production, we project eventually declining trajectories.
We measure the policy effort required to decarbonize the world economy as the difference between the reductions in CO2 emissions required to achieve the nominal IPCC targets and the evolution of emissions as given by the underlying ARFIMA projections. Our results suggest that, at the aggregate level, achieving these policy targets requires additional deliberate policy efforts leading to reductions in emissions of 57.4% and 97.4% of the 2010 levels by 2030 and 2050, respectively.
Accordingly, the necessary policy efforts are very sizeable and more so than suggested by the IPCC targets themselves.
At a disaggregated level, our results also suggest that emissions from liquid fuels, gas fuels, and cement production will reach levels clearly above the IPCC 2010 reference value. Accordingly, the inertia underlying the natural emissions system will require policy efforts equivalent to 99.3%, 114.9%, and 118.3% of 2010 emissions, respectively, in order to achieve the 2050 target. By contrast, for emissions from solid fuels and gas flaring, inertia will reduce the required policy effort by 72.0% and 80.1% of 2010 levels, respectively.
The policy efforts necessary to reach carbon neutrality by 2050 are not only sizeable; they are also frontloaded. Indeed, about 60% of the necessary reductions have to be achieved in the next decade and just 40% over the following two decades. By contrast, the IPCC targets considered in isolation would imply a share of effort of about 52.9% over the next decade. The policy efforts for reducing emissions from liquid fuels roughly match the aggregate pattern. In turn, the policy efforts for gas fuels and cement production are more frontloaded than indicated by aggregate emissions, while those for solid fuels and gas flaring are much less frontloaded.
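The frontloading shares above follow from simple arithmetic, assuming the share is the 2030 effort as a fraction of the cumulative 2050 effort. For the IPCC-only comparison we additionally assume the widely cited interim IPCC target of a 45% cut by 2030 against the 85.1% cumulative reduction quoted earlier; both assumptions are ours:

```python
# Frontloading share of the total decarbonization effort, computed as the
# effort required by 2030 divided by the cumulative effort required by 2050.
effort_2030 = 57.4   # % of 2010 emissions (paper's aggregate figure)
effort_2050 = 97.4   # % of 2010 emissions (paper's aggregate figure)
share_first_decade = 100.0 * effort_2030 / effort_2050
print(round(share_first_decade, 1))  # -> 58.9, i.e. about 60% of the total effort

# IPCC targets in isolation (45% interim cut is an assumed figure):
ipcc_share = 100.0 * 45.0 / 85.1
print(round(ipcc_share, 1))  # -> 52.9, matching the share quoted in the text
```

The gap between 58.9% and 52.9% is the extra frontloading imposed by the inertia of the emissions system.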
Finally, our results suggest that policies for the decarbonization of the economy by 2050 should be tailored to the specific characteristics of each of the different components of total CO2 emissions, both in terms of their magnitude and their trajectory. Given the differences in the inertia of the different types of emissions, a one-size-fits-all approach is not appropriate.
The economic and societal impacts of climate change (on productivity, water resources, transport, energy production and consumption, tourism and leisure, infrastructure, food production capacity, well-being and public health, migration, biodiversity, and even political stability) are still far from being fully identified, and much less internalized into policy decision making (see Tol, 2018).
Our results reinforce the need to define and implement climate and energy transition, adaptation, and mitigation policies consistent with the goal of carbon neutrality by 2050 and fully aligned with both the goals of the Paris Agreement and the United Nations Sustainable Development Goals. Such policies are urgent, daunting, and frontloaded.