Gradient boosting with moving-average terms for nonlinear sequential regression
Abstract
We investigate sequential nonlinear regression and introduce a novel gradient boosting algorithm that exploits the residuals, i.e., the prediction errors, as additional features, inspired by the well-known linear autoregressive moving-average (ARMA) models. Our algorithm uses the state information from earlier time steps contained in the residuals to improve performance in a nonlinear sequential regression/prediction framework. It captures the changes across previous time steps through residual terms between boosting steps by jointly optimizing the model parameters and the feature vectors. For this optimization, we define an iterative procedure that alternates between two steps: the former finds the optimal base learner parameters for fixed residual values, and the latter updates the residuals given the new parameters. We show on both synthetic and well-known real-life competition datasets that our method significantly outperforms the state-of-the-art.
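To make the idea concrete, the following is a minimal, hedged sketch of the alternating scheme described above, not the paper's exact algorithm: a boosting loop on a scalar sequence in which each feature vector is augmented with a lagged residual (the MA-style term). The base learner here is a depth-1 regression stump and the synthetic sinusoidal target, learning rate, and number of rounds are all illustrative assumptions.

```python
import math

# Illustrative base learner: a regression stump (assumption; the paper
# does not prescribe this particular learner in the abstract).
def fit_stump(X, r):
    """Pick (feature j, threshold t, left/right means) minimizing SSE on residuals r."""
    best = None
    n, d = len(X), len(X[0])
    for j in range(d):
        for t in sorted(set(x[j] for x in X)):
            left = [r[i] for i in range(n) if X[i][j] <= t]
            right = [r[i] for i in range(n) if X[i][j] > t]
            if not left or not right:
                continue
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = sum((v - ml) ** 2 for v in left) + sum((v - mr) ** 2 for v in right)
            if best is None or sse < best[0]:
                best = (sse, j, t, ml, mr)
    return best[1:]

def predict_stump(stump, x):
    j, t, ml, mr = stump
    return ml if x[j] <= t else mr

# Synthetic sequence (assumption, for illustration only).
y = [math.sin(0.3 * t) for t in range(40)]

lr, n_rounds = 0.5, 10
stumps = []
F = [0.0] * len(y)                       # current ensemble prediction
resid = [y[t] - F[t] for t in range(len(y))]

for m in range(n_rounds):
    # Step 1: with residuals held fixed, build feature vectors from the
    # lagged target AND the lagged residual (the MA-style term), then fit
    # the base learner to the current residuals.
    X = [[y[t - 1], resid[t - 1]] for t in range(1, len(y))]
    r = resid[1:]
    s = fit_stump(X, r)
    stumps.append(s)
    # Step 2: with the new learner fixed, update the ensemble prediction
    # and refresh the residuals, so the MA features seen by the next
    # boosting round reflect the updated model.
    preds = [predict_stump(s, [y[t - 1], resid[t - 1]]) for t in range(1, len(y))]
    for k, t in enumerate(range(1, len(y))):
        F[t] += lr * preds[k]
    resid = [y[t] - F[t] for t in range(len(y))]

mse = sum((y[t] - F[t]) ** 2 for t in range(1, len(y))) / (len(y) - 1)
print(mse)
```

In this sketch the alternation is what matters: each round fits a learner against frozen residual features, then rewrites the residuals before the next round, mirroring the two-step optimization the abstract describes.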