New Forecasting Papers 2011-10-06

In this issue we have Forecasting with Approximate Dynamic Factor Models, Forecasting volatility, Mapping the state of financial stability, Testing interval forecasts with a GMM-based approach, Backtesting Value-at-Risk using Forecasts for Multiple Horizons, A Simple Model for Vast Panels of Volatilities, Econometric Analysis and Prediction of Recurrent Events, The Analysis of Stochastic Volatility in the Presence of Daily Realised Measures, and Carbon Tax Scenarios and their Effects on the Irish Energy Sector.

 

  • Forecasting with Approximate Dynamic Factor Models: the Role of Non-Pervasive Shocks

 

Date:

2011-07

By:

Mattéo Luciani

URL:

http://d.repec.org/n?u=RePEc:eca:wpaper:2013/97308&r=for  

In this paper we investigate whether accounting for non-pervasive shocks improves the forecasts of a factor model. We compare four models on a large panel of US quarterly data: factor models, factor models estimated on selected variables, Bayesian shrinkage, and factor models combined with Bayesian shrinkage for the idiosyncratic component. The results of the forecasting exercise show that the four approaches perform equally well and produce highly correlated forecasts, meaning that non-pervasive shocks are of no help in forecasting. We conclude that the comovements captured by factor models are informative enough to make accurate forecasts.

JEL:

C13
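
Not from the paper, but for orientation: every approach compared above builds on the same two-step "diffusion index" recipe, which extracts factors from a standardized panel by principal components and then forecasts the target from a factor-augmented regression. A minimal sketch in Python, with illustrative function names and a fixed factor count r:

```python
import numpy as np

def pca_factors(X, r):
    """Extract r static factors from a T x N panel by principal
    components: eigenvectors of the sample covariance of the
    standardized data."""
    X = (X - X.mean(0)) / X.std(0)           # standardize each series
    _, eigvec = np.linalg.eigh(X.T @ X / len(X))
    loadings = eigvec[:, ::-1][:, :r]        # top-r eigenvectors
    return X @ loadings, loadings            # T x r factors, N x r loadings

def factor_forecast(y, F, h=1):
    """h-step 'diffusion index' forecast: regress y_{t+h} on a constant
    and the current factors, then predict from the last observation."""
    Z = np.column_stack([np.ones(len(F) - h), F[:-h]])
    beta, *_ = np.linalg.lstsq(Z, y[h:], rcond=None)
    return np.concatenate([[1.0], F[-1]]) @ beta
```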

 

  • Forecasting volatility: does continuous time do better than discrete time?

 

Date:

2011-07

By:

Carles Bretó
  Helena Veiga

URL:

http://d.repec.org/n?u=RePEc:cte:wsrepe:ws112518&r=for  

In this paper we compare the forecast performance of continuous- and discrete-time volatility models. In discrete time, we consider more than ten GARCH-type models and an asymmetric autoregressive stochastic volatility model; in continuous time, a stochastic volatility model with mean reversion, volatility feedback and leverage. We estimate each model by maximum likelihood and evaluate its ability to forecast two-scales realized volatility, a nonparametric estimate of volatility based on high-frequency data that minimizes the biases present in realized volatility caused by microstructure errors. We find that volatility forecasts based on continuous-time models may outperform those of GARCH-type discrete-time models, so that, besides their other merits, continuous-time models may be used as a tool for generating reasonable volatility forecasts. Within the stochastic volatility family, however, we do not find such evidence. We show that volatility feedback may have serious drawbacks in terms of forecasting and that an asymmetric disturbance distribution (possibly with heavy tails) might improve forecasting.

Keywords:

Asymmetry, Continuous and discrete-time stochastic volatility models, GARCH-type models, Maximum likelihood via iterated filtering, Particle filter, Volatility forecasting
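
The forecast target, two-scales realized volatility, has a compact closed form. A hedged sketch, assuming the input is one day's regularly spaced intraday log prices and using an illustrative number of subgrids K:

```python
import numpy as np

def tsrv(logp, K=5):
    """Two-scales realized volatility (Zhang, Mykland & Ait-Sahalia):
    average the realized variance over K sparse subgrids, then subtract
    a bias correction built from the noisy full-grid realized variance."""
    n = len(logp) - 1                        # returns on the full grid
    rv_all = np.sum(np.diff(logp) ** 2)      # full-grid RV (noise-dominated)
    rv_sub = np.mean([np.sum(np.diff(logp[k::K]) ** 2) for k in range(K)])
    nbar = (n - K + 1) / K                   # average subgrid sample size
    return rv_sub - (nbar / n) * rv_all      # bias-corrected estimate
```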

 

  • Mapping the state of financial stability

 

Date:

2011-09

By:

Peter Sarlin (Åbo Akademi University, Turku Centre for Computer Science, Joukahaisenkatu 3–5, 20520 Turku, Finland)
  Tuomas A. Peltonen (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt, Germany)

URL:

http://d.repec.org/n?u=RePEc:ecb:ecbwps:20111382&r=for  

The paper uses the Self-Organizing Map (SOM) for mapping the state of financial stability, visualizing the sources of systemic risk, and predicting systemic financial crises. The Self-Organizing Financial Stability Map (SOFSM) provides a two-dimensional representation of a multidimensional financial stability space that allows the individual sources of systemic risk to be disentangled. The SOFSM can be used to monitor macro-financial vulnerabilities by locating a country in the financial stability cycle: the pre-crisis, crisis, post-crisis or tranquil state. In addition, the SOFSM performs better than, or as well as, a logit model in classifying in-sample data and predicting out-of-sample the global financial crisis that started in 2007. Model robustness is tested by varying the thresholds of the models, the policymaker’s preferences, and the forecasting horizons.

Keywords:

Systemic financial crisis, systemic risk, Self-Organizing Map (SOM), visualization, prediction, macroprudential supervision.

JEL:

E44, E58, F01, F37, G01
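
As a reader's sketch (not the SOFSM itself): the Self-Organizing Map underneath is a small algorithm in which each node of a two-dimensional grid holds a codebook vector, and the best-matching node and its grid neighbours are pulled toward each presented sample. Grid size, decay schedules and the initialization below are illustrative assumptions:

```python
import numpy as np

def train_som(X, rows=6, cols=6, iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal Self-Organizing Map trained by competitive learning."""
    rng = np.random.default_rng(seed)
    grid = np.array([(i, j) for i in range(rows) for j in range(cols)])
    W = rng.standard_normal((rows * cols, X.shape[1]))   # codebook vectors
    for t in range(iters):
        x = X[rng.integers(len(X))]                      # random sample
        bmu = np.argmin(((W - x) ** 2).sum(1))           # best-matching unit
        lr = lr0 * np.exp(-t / iters)                    # decaying step size
        sigma = sigma0 * np.exp(-t / iters)              # shrinking radius
        d2 = ((grid - grid[bmu]) ** 2).sum(1)            # grid distances
        W += lr * np.exp(-d2 / (2 * sigma ** 2))[:, None] * (x - W)
    return W, grid
```

A country-quarter is then "located" on the map by finding the node whose codebook vector is closest to its vector of macro-financial indicators.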

 

  • Testing interval forecasts: a GMM-based approach

 

Date:

2011-08

By:

Elena-Ivona Dumitrescu (LEO – Laboratoire d’économie d’Orléans – CNRS : UMR6221 – Université d’Orléans)
  Christophe Hurlin (LEO – Laboratoire d’économie d’Orléans – CNRS : UMR6221 – Université d’Orléans)
  Jaouad Madkour (LEO – Laboratoire d’économie d’Orléans – CNRS : UMR6221 – Université d’Orléans)

URL:

http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00618467&r=for  

This paper proposes a new evaluation framework for interval forecasts. Our model-free test can be used to evaluate interval forecasts and High Density Regions, potentially discontinuous and/or asymmetric. Based on a simple J-statistic and the moments defined by the orthonormal polynomials associated with the Binomial distribution, this new approach has many advantages. First, it is extremely easy to implement. Second, it allows separate tests of the unconditional coverage, independence and conditional coverage hypotheses. Third, Monte Carlo simulations show that for realistic sample sizes our GMM test has good small-sample properties. These results are corroborated by an empirical application to the S&P 500 and Nikkei stock market indexes, which confirms that using this GMM test has major consequences for the ex-post evaluation of interval forecasts produced by linear versus nonlinear models.

Keywords:

Interval forecasts, High Density Region, GMM.
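
To make the idea concrete, a hedged single-moment version: the first orthonormal polynomial of a Bernoulli(p) variable, applied to the interval's hit sequence, gives the unconditional coverage test; the paper's full procedure stacks higher-order polynomials and their lags to test independence as well. The function name is illustrative:

```python
import numpy as np
from scipy.stats import chi2

def uc_jtest(hits, p):
    """Sketch of a GMM coverage test: for hits I_t in {0,1} with nominal
    coverage p, the moment (I_t - p)/sqrt(p(1-p)) has mean zero under
    correct unconditional coverage, so T * mean(m)^2 is chi-squared(1)."""
    m = (np.asarray(hits) - p) / np.sqrt(p * (1 - p))
    J = len(m) * m.mean() ** 2               # J-statistic with one moment
    return J, chi2.sf(J, df=1)               # statistic and p-value
```

Here hits would be the indicator that the realization fell inside the announced interval, e.g. hits = (lo <= y) & (y <= hi) for nominal coverage p = 0.90.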

 

  • Backtesting Value-at-Risk using Forecasts for Multiple Horizons, a Comment on the Forecast Rationality Tests of A.J. Patton and A. Timmermann

 

Date:

2011-09-20

By:

Lennart F. Hoogerheide (VU University Amsterdam)
  Francesco Ravazzolo (Norges Bank)
  Herman K. van Dijk (Erasmus University Rotterdam, VU University Amsterdam)

URL:

http://d.repec.org/n?u=RePEc:dgr:uvatin:20110131&r=for  

Patton and Timmermann (2011, ‘Forecast Rationality Tests Based on Multi-Horizon Bounds’, Journal of Business & Economic Statistics, forthcoming) propose a set of useful tests for forecast rationality or optimality under squared error loss, including an easily implemented test based on a regression that only involves (long-horizon and short-horizon) forecasts and no observations on the target variable. We propose an extension, a simulation-based procedure that takes into account the presence of errors in parameter estimates. This procedure can also be applied in the field of ‘backtesting’ models for Value-at-Risk. Applications to simple AR and ARCH time series models show that its power in detecting certain misspecifications is larger than the power of well-known tests for correct unconditional coverage and conditional coverage.

Keywords:

Value-at-Risk; backtest; optimal revision; forecast rationality

JEL:

C12
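
For context, the unconditional coverage benchmark mentioned at the end of the abstract is usually implemented as Kupiec's proportion-of-failures likelihood-ratio test. A minimal sketch, assuming the violation count is strictly between 0 and T so all logarithms are finite:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, p):
    """Kupiec proportion-of-failures test: LR statistic for the null
    that the VaR violation rate equals the nominal level p."""
    v = np.asarray(violations)
    T, x = len(v), int(v.sum())              # sample size, violation count
    loglik = lambda q: (T - x) * np.log(1 - q) + x * np.log(q)
    LR = -2 * (loglik(p) - loglik(x / T))    # restricted vs. unrestricted
    return LR, chi2.sf(LR, df=1)
```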

 

  • A Simple Model for Vast Panels of Volatilities

 

Date:

2011-09

By:

Mattéo Luciani
  David Veredas

URL:

http://d.repec.org/n?u=RePEc:eca:wpaper:2013/97304&r=for  

Realized volatilities, when observed through time, share the following stylized facts: co-movements, clustering, long memory, dynamic volatility, skewness and heavy tails. We propose a simple dynamic factor model that captures these stylized facts and that can be applied to vast panels of volatilities, as it does not suffer from the curse of dimensionality. It is an enhanced version of Bai and Ng (2004) in the following respects: i) we allow for long memory in both the idiosyncratic and the common components, ii) the common shocks are conditionally heteroskedastic, and iii) the idiosyncratic and common shocks are skewed and heavy-tailed. Estimation of the factors, the idiosyncratic components and the parameters is straightforward: principal components and low-dimensional maximum likelihood estimation. A thorough Monte Carlo study shows the usefulness of the approach, and an application to 90 daily realized volatilities of S&P100 constituents, from January 2001 to December 2008, evinces, among others, the following findings: i) all the volatilities have long memory, more than half in the nonstationary range, which increases during financial turmoil; ii) tests and criteria point towards one dynamic common factor driving the co-movements; iii) the factor has longer memory than the assets' volatilities, suggesting that long memory is a market characteristic; iv) the volatility of the realized volatility is not constant and is common to all; v) a forecasting horse race against univariate short- and long-memory models and short-memory dynamic factor models shows that our model delivers superior short-, medium-, and long-run predictions, in particular in periods of stress.

Keywords:

realized volatilities; vast dimensions; factor models; long-memory; forecasting

JEL:

C32
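
The distinguishing ingredient here relative to standard factor models is long memory. As an illustrative fragment (not the authors' estimator), the fractional difference filter (1 - L)^d that generates such behaviour can be applied through the binomial expansion of its coefficients:

```python
import numpy as np

def fracdiff(x, d):
    """Apply (1 - L)^d with pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k;
    d in (0, 0.5) corresponds to stationary long memory, d >= 0.5 to the
    nonstationary range reported in the paper."""
    x = np.asarray(x, dtype=float)
    k = np.arange(1, len(x))
    w = np.concatenate([[1.0], np.cumprod((k - 1 - d) / k)])  # pi-weights
    return np.array([w[:t + 1] @ x[t::-1] for t in range(len(x))])
```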

 

  • Econometric Analysis and Prediction of Recurrent Events

 

Date:

2011-09-19

By:

Adrian Pagan (School of Economics, University of Sydney)
  Don Harding (School of Economics and Finance, La Trobe University)

URL:

http://d.repec.org/n?u=RePEc:aah:create:2011-33&r=for  

Economic events such as expansions and recessions in economic activity, bull and bear markets in stock prices, and financial crises have long attracted substantial interest. In recent times the focus has been on predicting these events and constructing Early Warning Systems for them. Econometric analysis of such recurrent events is, however, in its infancy. One can represent the events as a set of binary indicators. However, they differ from the binary random variables studied in micro-econometrics in being constructed from some (possibly) continuous data. The lecture discusses what difference this makes to their econometric analysis. It sets out a framework that deals with how the binary variables are constructed, what an appropriate estimation procedure would be, and the implications for predicting them. An example based on Turkish business cycles is used throughout the lecture.

Keywords:

Business and Financial Cycles, Binary Time Series, BBQ Algorithm

JEL:

C22
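
The BBQ algorithm named in the keywords dates turning points with a local extremum rule. A stripped-down sketch of its core, leaving out the censoring rules (alternation of peaks and troughs, minimum phase and cycle lengths) that the full procedure applies afterwards:

```python
import numpy as np

def candidate_turning_points(y, k=2):
    """Mark t as a candidate peak (trough) if y_t is the maximum
    (minimum) of the window y_{t-k}, ..., y_{t+k}; k=2 is the usual
    quarterly choice."""
    y = np.asarray(y)
    peaks, troughs = [], []
    for t in range(k, len(y) - k):
        window = y[t - k:t + k + 1]
        if y[t] == window.max():
            peaks.append(t)
        elif y[t] == window.min():
            troughs.append(t)
    return peaks, troughs
```

A binary recession indicator, equal to 1 from the quarter after a peak through the following trough, is then exactly the kind of constructed binary variable whose econometric analysis the lecture addresses.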

 

  • The Analysis of Stochastic Volatility in the Presence of Daily Realised Measures

 

Date:

2011-09-20

By:

Siem Jan Koopman (VU University Amsterdam)
  Marcel Scharth (VU University Amsterdam)

URL:

http://d.repec.org/n?u=RePEc:dgr:uvatin:20110132&r=for  

We develop a systematic framework for the joint modelling of returns and multiple daily realised measures. We assume a linear state space representation for the log realised measures, which are noisy and biased estimates of the log integrated variance, at least due to Jensen’s inequality, and we incorporate filtering methods for the estimation of the latent log volatility process. The endogeneity between daily returns and realised measures leads us to develop a consistent two-step estimation method for all parameters in our specification. This method is computationally straightforward even when the stochastic volatility model contains non-Gaussian return innovations and leverage effects. The empirical results reveal that measurement errors become significantly smaller after filtering and that the forecasts from our model outperform those from a set of recently developed alternatives.

Keywords:

Kalman filter; leverage; realised volatility; simulated maximum likelihood

JEL:

C22
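
A hedged univariate fragment of such a state space: a single realised measure observed as the latent AR(1) log volatility plus measurement error, filtered with the standard Kalman recursions. Parameter names and starting values are illustrative; the paper's specification has multiple measures, non-Gaussian returns and leverage:

```python
import numpy as np

def filter_log_volatility(y, phi, c, q, r, h0=0.0, p0=1.0):
    """Scalar Kalman filter for y_t = h_t + e_t, e_t ~ N(0, r), with
    h_t = c + phi * h_{t-1} + u_t, u_t ~ N(0, q). Returns E[h_t | y_1..t]."""
    h, P, filtered = h0, p0, []
    for obs in y:
        h_pred = c + phi * h                 # predict the state
        P_pred = phi ** 2 * P + q
        K = P_pred / (P_pred + r)            # Kalman gain
        h = h_pred + K * (obs - h_pred)      # update with the measurement
        P = (1 - K) * P_pred
        filtered.append(h)
    return np.array(filtered)
```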

 

  • Carbon Tax Scenarios and their Effects on the Irish Energy Sector

 

Date:

2011-09

By:

Di Cosmo, Valeria
  Hyland, Marie

URL:

http://d.repec.org/n?u=RePEc:esr:wpaper:wp407&r=for  

In this paper we use annual time series data from 1960 to 2008 to estimate the long-run price and income elasticities underlying energy demand in Ireland. The Irish economy is divided into five sectors: residential, industrial, commercial, agricultural and transport, and separate energy demand equations are estimated for each sector. Energy demand is broken down by fuel type, and price and income elasticities are estimated for the primary fuels in the Irish fuel mix. Using the estimated price and income elasticities, we forecast Irish sectoral energy demand out to 2025. The share of electricity in the Irish fuel mix is predicted to grow over time as the share of carbon-intensive fuels such as coal, oil and peat falls. The share of electricity in total energy demand grows most in the industrial and commercial sectors, while oil remains an important fuel in the residential and transport sectors. Having estimated the baseline forecasts, we impose two different carbon tax scenarios and assess their impact on energy demand, carbon dioxide emissions, and government revenue. If the level of the carbon tax is assumed to track the futures price of carbon under the EU-ETS, the carbon tax will rise from €21.50 per tonne of CO2 in 2012 (the first year forecasted) to €41 in 2025. Results show that under this scenario total emissions would be reduced by approximately 861,000 tonnes of CO2 in 2025 relative to a zero carbon tax scenario, and that such a tax would generate €1.1 billion in revenue in the same year. We also examine a high-tax scenario under which the emissions reductions and the revenue generated are greater. Finally, in order to assess the macroeconomic effects of a carbon tax, the carbon tax scenarios were run in HERMES, the ESRI’s medium-term macroeconomic model. The results from HERMES show that a carbon tax of €41 per tonne of CO2 would lead to a 0.21 per cent contraction in GDP and a 0.08 per cent reduction in employment. A higher carbon tax would lead to greater contractions in output.

Keywords:

CO2 emissions, energy demand, environmental tax, income distribution

JEL:

Q4
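
The elasticities at the heart of the exercise come from log-log demand equations; a minimal static sketch (the paper's long-run equations are presumably estimated with more care, but the interpretation of the slopes is the same):

```python
import numpy as np

def demand_elasticities(energy, price, income):
    """OLS on ln E_t = a + e_p ln P_t + e_y ln Y_t; the slopes e_p and
    e_y are the price and income elasticities of energy demand."""
    Z = np.column_stack([np.ones(len(energy)),
                         np.log(price), np.log(income)])
    beta, *_ = np.linalg.lstsq(Z, np.log(energy), rcond=None)
    return beta[1], beta[2]                  # (price, income) elasticities
```

With a price elasticity e_p, a carbon tax that raises a fuel's consumer price by x per cent changes its demand by roughly e_p times x per cent (a reduction, since e_p is negative), which is how the tax scenarios feed through to emissions and revenue.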

 

Taken from the NEP-FOR mailing list edited by Rob Hyndman.