Prediction of time series by statistical learning: general losses and fast rates.

Authors
Publication date
2013
Publication type
Journal Article
Summary
We establish rates of convergence in time series forecasting using the statistical learning approach based on oracle inequalities. A series of papers extends the oracle inequalities obtained for i.i.d. observations to time series under weak dependence conditions. Given a family of predictors and $n$ observations, oracle inequalities state that a predictor forecasts the series as well as the best predictor in the family, up to a remainder term $\Delta_n$. Using the PAC-Bayesian approach, we establish, under weak dependence conditions, oracle inequalities with optimal rates of convergence. We extend previous results for the absolute loss function to any Lipschitz loss function, with rates $\Delta_n \sim \sqrt{c(\Theta)/n}$, where $c(\Theta)$ measures the complexity of the model. We apply the method with quantile loss functions to forecast the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and on the time series, we refine the rates of convergence to $\Delta_n \sim c(\Theta)/n$. We achieve these fast rates for uniformly mixing processes for the first time. These rates are known to be optimal in the i.i.d. case and for individual sequences. In particular, we generalize the results of Dalalyan and Tsybakov on sparse regression estimation to the case of autoregression.
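To make the oracle-inequality framework concrete, here is a schematic statement consistent with the abstract; the risk notation $R$ and the estimator $\hat{\theta}$ are illustrative assumptions, not taken verbatim from the paper. A predictor $\hat{\theta}$ built from the $n$ observations satisfies

$$R(\hat{\theta}) \le \inf_{\theta \in \Theta} R(\theta) + \Delta_n, \qquad \Delta_n \sim \sqrt{\frac{c(\Theta)}{n}} \ \text{(slow rate)} \quad \text{or} \quad \Delta_n \sim \frac{c(\Theta)}{n} \ \text{(fast rate)},$$

so the forecaster pays at most the remainder $\Delta_n$ beyond the best risk achievable in the family indexed by $\Theta$.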
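As a concrete instance of a Lipschitz loss of the kind used in the GDP application, a standard choice (an assumption here; the paper's exact parametrization may differ) is the quantile, or pinball, loss at level $\tau \in (0, 1)$:

$$\ell_\tau(y, \hat{y}) = \begin{cases} \tau\,(y - \hat{y}) & \text{if } y \ge \hat{y}, \\ (1 - \tau)\,(\hat{y} - y) & \text{otherwise,} \end{cases}$$

which is $\max(\tau, 1-\tau)$-Lipschitz in $\hat{y}$ and therefore covered by the general rate $\Delta_n \sim \sqrt{c(\Theta)/n}$.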
Publisher
Walter de Gruyter GmbH