Gradient boosting with moving-average terms for nonlinear sequential regression

Abstract

We investigate sequential nonlinear regression and introduce a novel gradient boosting algorithm that exploits the residuals, i.e., the prediction errors, as additional features, inspired by the well-known linear autoregressive moving-average (ARMA) models. Our algorithm uses the state information from earlier time steps contained in the residuals to improve performance in a nonlinear sequential regression/prediction framework. It propagates the changes from previous time steps between boosting iterations through the residual terms by jointly optimizing the model parameters and the feature vectors. For this optimization, we define an iterative procedure that alternates between two steps: the first computes the optimal base-learner parameters for fixed residual values, and the second updates the residuals given the new parameters. We show on both artificial and well-known real-life competition datasets that our method significantly outperforms the state-of-the-art.
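The abstract's alternating scheme can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the ridge-regularized linear base learner, the lag count, the learning rate, and the function name `boosted_ma_fit` are all assumptions chosen for brevity. The key idea it demonstrates is augmenting each boosting round's feature vectors with lagged residuals (MA-style terms) and alternating between fitting the base learner for fixed residual features and recomputing the residuals under the new fit.

```python
import numpy as np

def boosted_ma_fit(X, y, n_rounds=20, n_lags=2, lr=0.1, n_alt=3):
    """Hypothetical sketch of gradient boosting with moving-average terms.

    Each round augments the features with `n_lags` lagged residuals and
    alternates (n_alt times) between (a) solving for base-learner weights
    with the residual features held fixed and (b) refreshing the residuals
    given the new weights. Base learner: ridge-regularized linear model
    (an assumption; the paper's base learners may differ).
    """
    n, d = X.shape
    pred = np.zeros(n)
    learners = []
    for _ in range(n_rounds):
        resid = y - pred                       # current prediction errors
        lagged = np.zeros((n, n_lags))         # MA features: past residuals
        for _ in range(n_alt):
            # (b-prep) rebuild lagged-residual features from current resid
            for k in range(1, n_lags + 1):
                lagged[k:, k - 1] = resid[:-k]
            Z = np.hstack([X, lagged])
            # (a) ridge solve for weights, residual features held fixed
            w = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(d + n_lags),
                                Z.T @ resid)
            # (b) update residuals given the new parameters
            resid = y - (pred + lr * (Z @ w))
        learners.append(w)
        pred = pred + lr * (Z @ w)             # commit this round's learner
    return learners, pred
```

On synthetic data (e.g., `y = sin(x0) + 0.5*x1 + noise`) the training error of `pred` decreases over the boosting rounds, since each round fits the remaining residual; the alternating inner loop simply keeps the lagged-residual features consistent with the latest parameters.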

Source Title

IEEE Signal Processing Letters

Publisher

Institute of Electrical and Electronics Engineers


Language

en