Title: Nonlinear regression via incremental decision trees
Authors: Vanlı, N.; Sayın, M.; Neyshabouri, Mohammadreza Mohaghegh; Özkan, H.; Kozat, Süleyman S.
Date issued: 2019
Date available: 2020-02-12
ISSN: 0031-3203
Handle: http://hdl.handle.net/11693/53312
Type: Article
DOI: 10.1016/j.patcog.2018.08.014
Language: English
Keywords: Online regression; Sequential learning; Nonlinear models; Incremental decision trees

Abstract: We study sequential nonlinear regression and introduce an online algorithm that mitigates, via an adaptive incremental hierarchical structure, the convergence and undertraining issues of conventional nonlinear regression methods. In particular, we present a piecewise linear (or nonlinear) regression algorithm that partitions the regressor space and learns a linear model in each region, combining these models to form its prediction. Unlike conventional approaches, our algorithm effectively learns the optimal partition of the regressor space, with the desired complexity, in a completely sequential and data-driven manner. The algorithm sequentially and asymptotically achieves the performance of the optimal twice-differentiable regression function for any data sequence, without any statistical assumptions. It can be implemented efficiently, with a computational complexity that is only logarithmic in the length of the data. In our experiments, we demonstrate significant gains over state-of-the-art techniques on well-known real benchmark data sets.
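To illustrate the general idea the abstract describes, the following is a minimal, hypothetical Python sketch of online piecewise linear regression with an incrementally grown tree: a node splits on the median of its highest-variance input dimension once it has buffered enough samples, and each leaf's linear model is updated with an LMS (stochastic gradient) step. The split rule, step size, buffer capacity, and the hard single-leaf prediction are all illustrative assumptions; the paper's actual algorithm adaptively combines models across the hierarchy and carries the stated performance guarantees, which this sketch does not.

```python
# Sketch only: online piecewise linear regression with an incrementally
# grown partition tree. All design choices below (LMS step size, split on
# the widest dimension at its median, fixed node capacity) are assumptions
# for illustration, not the algorithm from the paper.
import numpy as np

class Node:
    def __init__(self, dim, lr=0.05, capacity=50):
        self.w = np.zeros(dim + 1)   # linear model with a bias term
        self.lr = lr                 # LMS step size (assumed)
        self.capacity = capacity     # samples buffered before a split (assumed)
        self.buffer = []             # recent inputs, used to choose the split
        self.split_dim = None
        self.split_val = None
        self.left = self.right = None

    def _leaf(self, x):
        # Descend to the leaf whose region contains x.
        node = self
        while node.split_dim is not None:
            node = node.left if x[node.split_dim] <= node.split_val else node.right
        return node

    def predict(self, x):
        return self._leaf(x).w @ np.append(x, 1.0)

    def update(self, x, y):
        leaf = self._leaf(x)
        z = np.append(x, 1.0)
        err = y - leaf.w @ z
        leaf.w += leaf.lr * err * z  # LMS update on the active region only
        leaf.buffer.append(x)
        if len(leaf.buffer) >= leaf.capacity:
            leaf._split()

    def _split(self):
        # Split this leaf on the median of its highest-variance dimension
        # (an assumed rule); children inherit the parent's linear model.
        X = np.array(self.buffer)
        self.split_dim = int(np.argmax(X.var(axis=0)))
        self.split_val = float(np.median(X[:, self.split_dim]))
        dim = len(self.w) - 1
        self.left = Node(dim, self.lr, self.capacity)
        self.right = Node(dim, self.lr, self.capacity)
        self.left.w = self.w.copy()
        self.right.w = self.w.copy()
        self.buffer = []

# Usage: learn y = |x| sequentially, a target no single linear model can fit.
rng = np.random.default_rng(0)
tree = Node(dim=1)
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0, size=1)
    y = abs(x[0]) + 0.01 * rng.standard_normal()
    tree.update(x, y)
print(tree.predict(np.array([0.5])))  # should approach 0.5
```

Because the tree refines its partition only where data accumulate, regions that see many samples are modeled at finer granularity, which is the intuition behind growing the hierarchy incrementally rather than fixing the partition in advance.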