New Forecasting Papers 2011-03-29

In this issue we have Improving forecasting performance by window and model averaging; Does Money Help Predict Inflation?; Modelling the Currency in Circulation for the State of Qatar; Are Forecast Updates Progressive?; Forecasting the Term Structure of Interest Rates Using Integrated Nested Laplace Approximations; and more.

  1. Improving forecasting performance by window and model averaging
    Date: 2011-02-21
    By: Prasad S Bhattacharya
    Dimitrios D Thomakos
    This study presents extensive results on the benefits of rolling window and model averaging. Building on the recent work on rolling window averaging by Pesaran et al (2010, 2009) and on exchange rate forecasting by Molodtsova and Papell (2009), we explore whether rolling window averaging can be considered beneficial on a priori grounds. We investigate whether rolling window averaging can improve the performance of model averaging, especially when ‘simpler’ models are used. The analysis provides strong support for rolling window averaging, outperforming the best window forecasts more than 50% of the time across all rolling windows. Furthermore, rolling window averaging smoothes out the forecast path, improves robustness, and minimizes the pitfalls associated with potential structural breaks.
    Keywords: Exchange rate forecasting, inflation forecasting, output growth forecasting, rolling window, model averaging, short horizon, robustness.
    JEL: C22
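    The averaging scheme studied here can be illustrated with a minimal sketch: produce a one-step-ahead forecast from each of several rolling windows, then average across windows. The simple-mean forecaster and the window lengths below are illustrative choices, not the models estimated in the paper.

```python
# Minimal sketch of rolling-window averaging of one-step-ahead forecasts.
# The mean-of-window forecaster and window lengths are illustrative only.

def window_forecast(series, window):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(series[-window:]) / window

def averaged_forecast(series, windows):
    """Average the forecasts produced by several rolling window lengths."""
    forecasts = [window_forecast(series, w) for w in windows]
    return sum(forecasts) / len(forecasts)

if __name__ == "__main__":
    data = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 1.2, 1.4]
    # A single fixed window versus an average over several window lengths.
    print(window_forecast(data, 4))
    print(averaged_forecast(data, [2, 4, 6]))
```

    Averaging over window lengths removes the need to pick one "best" window a priori, which is the robustness argument the abstract makes.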
  2. Does Money Help Predict Inflation? An Empirical Assessment for Central Europe
    Date: 2010-12
    By: Roman Horvath
    Lubos Komarek
    Filip Rozsypal
    This paper investigates the predictive ability of money for future inflation in the Czech Republic, Hungary, Poland, and Slovakia. We construct monetary indicators similar to those the ECB regularly uses for monetary analysis. We find some in-sample evidence that money matters for future inflation at the policy horizons that central banks typically focus on, but our pseudo out-of-sample forecasting exercise shows that money does not in general improve the inflation forecasts vis-à-vis some benchmark models, such as the autoregressive process. Since at least some models containing money improve the inflation forecasts in certain periods, we argue that money still serves as a useful cross-check for monetary policy analysis.
    Keywords: Central Europe, forecasting, inflation, money.
    JEL: E41
  3. Modelling the Currency in Circulation for the State of Qatar
    Date: 2010-01-15
    By: Balli, Faruk
    Elsamadisy, Elsayed
    The main concern of this report is modelling and forecasting the currency in circulation (CIC) for the State of Qatar at daily and weekly horizons. The time series of daily observations of the CIC is expected to display marked seasonal and cyclical patterns on a daily, weekly or even monthly basis. We compare the forecasting performance of typical linear forecasting models, namely a regression model and a seasonal ARIMA model, using daily data. We find that the seasonal ARIMA model performs better in forecasting the CIC, particularly at short horizons.
    Keywords: Currency in Circulation; Forecasting; Seasonal ARIMA
    JEL: C32
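    As a hedged illustration of the weekly seasonality such models exploit, a seasonal-naive benchmark (repeat the value observed one week earlier) is the usual baseline a seasonal ARIMA model must beat on daily data. The series below is synthetic, not the Qatari CIC data.

```python
# Seasonal-naive baseline for a daily series with a weekly cycle: forecast
# today's value as the value observed 7 days ago. This is an illustration,
# not one of the models estimated in the paper; the data are synthetic.

SEASON = 7  # weekly cycle in daily data

def seasonal_naive(history):
    """Forecast the next observation as the one SEASON steps back."""
    return history[-SEASON]

def mae(actuals, forecasts):
    """Mean absolute error of a list of forecasts."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

if __name__ == "__main__":
    # Two weeks of synthetic daily data with a pronounced weekday pattern.
    series = [100, 98, 97, 99, 103, 110, 105,
              101, 99, 98, 100, 104, 111, 106]
    # Forecast each day of the second week from the history up to that day.
    preds = [seasonal_naive(series[:i]) for i in range(SEASON, len(series))]
    print(mae(series[SEASON:], preds))  # → 1.0
```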
  4. Are Forecast Updates Progressive?
    Date: 2011-03
    By: Chia-Lin Chang (Department of Applied Economics, Department of Finance, National Chung Hsing University)
    Philip Hans Franses (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam)
    Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University)
    Many macro-economic forecasts and forecast updates, such as those from the IMF and OECD, typically involve both a model component, which is replicable, and intuition (namely, expert knowledge possessed by a forecaster), which is non-replicable. Learning from previous mistakes can affect both the replicable model component and intuition. If learning, and hence forecast updates, are progressive, forecast updates should generally become more accurate as the actual value is approached. Otherwise, learning and forecast updates would be neutral. The paper proposes a methodology to test whether macro-economic forecast updates are progressive, in which the interaction between model and intuition is explicitly taken into account. The empirical analysis uses three decades of quarterly data for Taiwan on forecasts, and their updates, of two economic fundamentals, namely the inflation rate and the real GDP growth rate. The empirical results suggest that the forecast updates for Taiwan are progressive, and that progress can be explained predominantly by improved intuition.
    Keywords: Macro-economic forecasts, econometric models, intuition, learning, progressive forecast updates, forecast errors.
    JEL: C53
  5. Forecasting the Term Structure of Interest Rates Using Integrated Nested Laplace Approximations
    Date: 2011-03-14
    By: Márcio Laurini (IBMEC Business School)
    Luiz Koodi Hotta (IMECC-Unicamp)
    This article discusses the use of Bayesian methods for inference and forecasting in dynamic term structure models via Integrated Nested Laplace Approximations (INLA). This method of analytical approximation allows accurate inference for latent factors, parameters and forecasts in dynamic models at reduced computational cost. In estimating dynamic term structure models it also avoids simplifications in the inference procedures, such as two-stage estimation. The results obtained for the dynamic Nelson-Siegel model indicate that this methodology produces more accurate out-of-sample forecasts than both two-stage estimation by OLS and Bayesian estimation by MCMC. These analytical approximations also allow efficient calculation of model selection measures, such as generalized cross-validation and marginal likelihood, which may be computationally prohibitive under MCMC estimation.
    Keywords: Term Structure, Latent Factors, Bayesian Forecasting, Laplace Approximations
    JEL: C11
  6. Markov-Switching MIDAS Models
    Date: 2011
    By: Pierre Guerin
    Massimiliano Marcellino
    This paper introduces a new regression model – Markov-switching mixed data sampling (MS-MIDAS) – that incorporates regime changes in the parameters of mixed data sampling (MIDAS) models and allows for the use of mixed-frequency data in Markov-switching models. After a discussion of estimation and inference for MS-MIDAS, and a small-sample simulation-based evaluation, the MS-MIDAS model is applied to predicting US and UK economic activity, in terms both of quantitative forecasts of aggregate economic activity and of prediction of business cycle regimes. Both simulation and empirical results indicate that MS-MIDAS is a very useful specification.
    Keywords: Business cycle, Mixed-frequency data, Non-linear models, Forecasting, Nowcasting
    JEL: C22
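    The MIDAS ingredient of the model can be sketched in a few lines: high-frequency observations are collapsed into a low-frequency regressor through a parsimonious lag polynomial, commonly exponential Almon weights. The parameter values below are illustrative, not estimates from the paper.

```python
import math

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag weights used in MIDAS regressions:
    w_k proportional to exp(theta1*k + theta2*k^2), normalised to sum to 1."""
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(1, n_lags + 1)]
    total = sum(raw)
    return [r / total for r in raw]

def midas_aggregate(high_freq_obs, theta1, theta2):
    """Collapse recent high-frequency observations (most recent first) into
    a single regressor for the low-frequency equation."""
    w = exp_almon_weights(len(high_freq_obs), theta1, theta2)
    return sum(wi * x for wi, x in zip(w, high_freq_obs))

if __name__ == "__main__":
    monthly = [0.4, 0.1, -0.2]   # three monthly growth rates in one quarter
    print(midas_aggregate(monthly, 0.5, -0.1))  # illustrative parameters
```

    The Markov-switching extension then lets the intercept, slope and weighting parameters of this regression differ across latent regimes.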
  7. Risk Management of Risk under the Basel Accord: Forecasting Value-at-Risk of VIX Futures
    Date: 2011-03
    By: Chia-Lin Chang (Department of Applied Economics, Department of Finance, National Chung Hsing University)
    Juan-Ángel Jiménez-Martín (Department of Quantitative Economics, Complutense University of Madrid)
    Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University)
    Teodosio Pérez-Amaral (Department of Quantitative Economics, Complutense University of Madrid)
    The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. McAleer, Jimenez-Martin and Perez-Amaral (2009) proposed a new approach to model selection for predicting VaR, consisting of combining alternative risk models, and comparing conservative and aggressive strategies for choosing between VaR models. This paper addresses the question of risk management of risk, namely VaR of VIX futures prices. We examine how different risk management strategies performed during the 2008-09 global financial crisis (GFC). We find that an aggressive strategy of choosing the Supremum of the single model forecasts is preferred to the other alternatives, and is robust during the GFC. However, this strategy implies relatively high numbers of violations and accumulated losses, though these are admissible under the Basel II Accord.
    Keywords: Median strategy, Value-at-Risk (VaR), daily capital charges, violation penalties, optimizing strategy, aggressive risk management, conservative risk management, Basel II Accord, VIX futures, global financial crisis (GFC).
    JEL: G32
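    The strategies compared here reduce to simple order statistics over the single-model forecasts: with VaR recorded as a negative return, the aggressive strategy takes the supremum (the least extreme forecast), the conservative strategy the infimum, and the median strategy the middle value. The model names and numbers below are illustrative, not results from the paper.

```python
# Model-combination strategies for daily VaR, with VaR stated as a
# negative return. Illustrative forecasts only.

def aggressive(var_forecasts):
    """Supremum strategy: the least extreme (closest to zero) VaR."""
    return max(var_forecasts)

def conservative(var_forecasts):
    """Infimum strategy: the most extreme (most negative) VaR."""
    return min(var_forecasts)

def median_strategy(var_forecasts):
    """Median of the single-model VaR forecasts."""
    s = sorted(var_forecasts)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

if __name__ == "__main__":
    # Hypothetical daily VaR forecasts (% returns) from four risk models.
    forecasts = [-2.1, -2.6, -2.3, -3.0]
    print(aggressive(forecasts), conservative(forecasts),
          median_strategy(forecasts))
```

    The supremum minimises daily capital charges but, as the abstract notes, produces more violations; the infimum does the opposite.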
  8. To Aggregate or Not to Aggregate: Should decisions and models have the same frequency?
    Date: 2010-12-15
    By: Kiygi Calli, M.
    Weverbergh, M.
    Franses, Ph.H.B.F.
    We examine the situation where hourly data are available to design advertising-response models, whereas managerial decision making can concern hourly, daily or weekly intervals. The key question is how models for hourly data compare to models based on weekly data with respect to forecasting accuracy and with respect to assessing advertising impact. Simulation experiments suggest that the strategy of modeling the least aggregated data and forecasting the more aggregate data yields better forecasts, provided that one has a correct model specification for the higher-frequency data. A detailed analysis of three actual data sets confirms this conclusion. A key feature of this confirmation is that aggregation affects the data transformation needed to dampen the variance, and the estimated advertising impact is sensitive to the appropriate transformation. Our conclusion is that disaggregated models are preferable even when decisions have to be made at lower frequencies.
    Keywords: advertising effectiveness; advertising response; aggregation; normative and predictive validity
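    The preferred strategy (model at the disaggregated frequency, then aggregate the forecasts) amounts to summing high-frequency forecasts into the decision-relevant totals; a minimal sketch, with made-up numbers:

```python
# Aggregating high-frequency forecasts into low-frequency totals, the
# strategy the paper finds superior to modeling aggregated data directly.

def aggregate(values, period):
    """Sum a high-frequency series into non-overlapping low-frequency totals."""
    return [sum(values[i:i + period]) for i in range(0, len(values), period)]

if __name__ == "__main__":
    # Two days of hypothetical hourly sales forecasts from an hourly model.
    hourly_forecasts = [2.0] * 24 + [3.0] * 24
    print(aggregate(hourly_forecasts, 24))  # → [48.0, 72.0]
```

    The paper's caveat applies: this only dominates direct weekly modeling when the hourly model is correctly specified.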
  9. Modelling and Forecasting with County Court Data: Regional Mortgage Possession Claims and Orders in England and Wales
    Date: 2011-02
    By: Janine Aron
    John Muellbauer
    This paper presents new quarterly panel data models for county court claims and orders for mortgage possession for seven regions of England plus Wales. Different types of data on mortgage possessions are compared. The innovations include the treatment of hard-to-observe variations in loan quality and shifts in lenders' forbearance policy, captured by common indicators based on dummy variables.
  10. Extracting deflation probability forecasts from Treasury yields
    Date: 2011
    By: Jens H. E. Christensen
    Jose A. Lopez
    Glenn D. Rudebusch
    We construct probability forecasts for episodes of price deflation (i.e., a falling price level) using yields on nominal and real U.S. Treasury bonds. The deflation probability forecasts identify two “deflation scares” during the past decade: a mild one following the 2001 recession, and a more serious one starting in late 2008 with the deepening of the financial crisis. The estimated deflation probabilities are generally consistent with those from macroeconomic models and surveys of professional forecasters, but they also provide high-frequency insight into the views of financial market participants. The probabilities can also be used to price the deflation option embedded in real Treasury bonds.
    Keywords: Deflation (Finance)
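    A much simpler stand-in for the paper's yield-curve model conveys the idea: treat the breakeven rate (nominal minus real yield) as the mean of a normal distribution for future inflation, so the deflation probability is the mass below zero. The breakeven and volatility numbers below are illustrative, not estimates from the paper.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def deflation_probability(breakeven, sigma):
    """P(inflation < 0) under a normal approximation whose mean is the
    breakeven rate (nominal minus real yield, in %) and std dev is sigma.
    A deliberately simplified stand-in for the paper's term structure model."""
    return normal_cdf((0.0 - breakeven) / sigma)

if __name__ == "__main__":
    # A 2% breakeven vs a 0.5% breakeven during a "deflation scare",
    # both with 1.5 percentage points of inflation uncertainty.
    print(deflation_probability(2.0, 1.5))
    print(deflation_probability(0.5, 1.5))
```

    A falling breakeven mechanically raises the deflation probability; the paper's contribution is extracting both the mean and the uncertainty from an arbitrage-free model of the two yield curves.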
  11. Global Temperature Trends
    Date: 2011-03
    By: Trevor Breusch
    Farshid Vahid
    Are global temperatures on a warming trend? It is difficult to be certain about trends when there is so much variation in the data and very high correlation from year to year. We investigate the question using statistical time series methods. Our analysis shows that the upward movement over the last 130-160 years is persistent and not explained by the high correlation, so it is best described as a trend. The warming trend becomes steeper after the mid-1970s, but there is no significant evidence for a break in trend in the late 1990s. Viewed from the perspective of 30 or 50 years ago, the temperatures recorded in most of the last decade lie above the confidence band of forecasts produced by a model that does not allow for a warming trend.
    Keywords: Land and ocean temperatures; deterministic and stochastic trends; persistence; piecewise linear trends
    JEL: C2
  12. Policymakers' Votes and Predictability of Monetary Policy
    Date: 2011
    By: Andrei Sirchenko
    The National Bank of Poland does not publish the Monetary Policy Council's voting records before the subsequent policy meeting. Using real-time data, this paper shows that a prompter release of the voting records could improve the predictability of policy decisions. The voting patterns reveal strong and robust predictive content even after controlling for policy bias and responses to inflation, real activity, exchange rates and financial market information. They contain information not embedded in the spreads and moves in the market interest rates, nor in the explicit forecasts of the next policy decision made by market analysts in Reuters surveys. Moreover, the direction of policymakers' dissent explains the direction of analysts' forecast bias. These findings are based on the voting patterns only, without the knowledge of policymakers' names.
    Keywords: monetary policy; predictability; policy interest rate; voting records; real-time data
    JEL: D70

Taken from the NEP-FOR mailing list edited by Rob Hyndman.