Browsing by Subject "Newton method"
Now showing 1 - 2 of 2
Item (Open Access)
Highly efficient hierarchical online nonlinear regression using second order methods (Elsevier B.V., 2017)
Civek, B. C.; Delibalta, I.; Kozat, S. S.
We introduce highly efficient online nonlinear regression algorithms that are suitable for real-life applications. We process the data in a truly online manner such that no storage is needed, i.e., the data is discarded after being used. For nonlinear modeling we use a hierarchical piecewise linear approach based on the notion of decision trees, where the space of the regressor vectors is adaptively partitioned based on the performance. For the first time in the literature, we learn both the piecewise linear partitioning of the regressor space and the linear models in each region using highly effective second-order methods, i.e., Newton-Raphson methods. Hence, we avoid the well-known overfitting issues by using piecewise linear models; however, since both the region boundaries and the linear models in each region are trained using second-order methods, we still achieve substantial performance gains compared to the state of the art. We demonstrate our gains on well-known benchmark data sets and provide performance results in an individual sequence manner, guaranteed to hold without any statistical assumptions. Hence, the introduced algorithms address the computational complexity issues widely encountered in real-life applications while providing superior guaranteed performance in a strong deterministic sense.

Item (Open Access)
Piecewise linear regression based on adaptive tree structure using second order methods (IEEE, 2016)
Civek, Burak Cevat; Delibalta, İ.; Kozat, Süleyman Serdar
We introduce a highly efficient online nonlinear regression algorithm. We process the data in a truly online manner such that no storage is needed, i.e., the data is discarded after being used. For nonlinear modeling we use a hierarchical piecewise linear approach based on the notion of decision trees, where the regressor space is adaptively partitioned based directly on the performance. For the first time in the literature, we learn both the piecewise linear partitioning of the regressor space and the linear models in each region using highly effective second-order methods, i.e., Newton-Raphson methods. Hence, we avoid the well-known overfitting issues and achieve substantial performance gains compared to the state of the art. We demonstrate our gains on well-known benchmark data sets and provide performance results in an individual sequence manner, guaranteed to hold without any statistical assumptions.
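
Both abstracts describe the same core idea: partition the regressor space with a tree-structured, trainable boundary and fit a linear model in each region, updating everything online with second-order methods. The sketch below is a minimal illustration of that idea, not the authors' algorithm: it uses a single soft split (a depth-1 tree) with a sigmoid gate, weighted recursive-least-squares (second-order) updates for the two region models, and a gradient step for the boundary. The class name, hyperparameters (lam, delta, eta), and the single-split structure are assumptions made for illustration only.

import numpy as np

class PiecewiseLinearNewton:
    """Illustrative online piecewise linear regressor with one soft split."""

    def __init__(self, dim, lam=0.99, delta=10.0, eta=0.05):
        self.w = [np.zeros(dim) for _ in range(2)]        # linear model per region
        self.P = [np.eye(dim) * delta for _ in range(2)]  # inverse correlation matrices (RLS)
        self.s = np.zeros(dim)                            # soft partition (boundary) weights
        self.lam = lam                                    # forgetting factor
        self.eta = eta                                    # step size for the boundary update

    def _gate(self, x):
        # Sigmoid gate: soft assignment of regressor x to region 0.
        return 1.0 / (1.0 + np.exp(-self.s @ x))

    def predict(self, x):
        p = self._gate(x)
        return p * (self.w[0] @ x) + (1.0 - p) * (self.w[1] @ x)

    def update(self, x, y):
        # Data are used once and then discarded (truly online processing).
        p = self._gate(x)
        err = y - self.predict(x)
        # Second-order (weighted RLS) update of each region model,
        # weighted by its soft responsibility for this sample.
        for r, resp in ((0, p), (1, 1.0 - p)):
            Px = self.P[r] @ x
            gain = Px / (self.lam / max(resp, 1e-8) + x @ Px)
            self.w[r] += gain * err
            self.P[r] = (self.P[r] - np.outer(gain, Px)) / self.lam
        # Gradient step on the boundary weights to reduce the squared error.
        grad_s = -err * (self.w[0] @ x - self.w[1] @ x) * p * (1.0 - p) * x
        self.s -= self.eta * grad_s
        return err

# Usage on a synthetic stream: each sample is processed once, no storage kept.
rng = np.random.default_rng(0)
model = PiecewiseLinearNewton(dim=2)
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0, size=2)
    y = np.sin(2.0 * x[0]) + 0.1 * rng.standard_normal()  # nonlinear target
    model.update(x, y)
print("prediction at [0.5, -0.3]:", model.predict(np.array([0.5, -0.3])))

In the papers the partition is a deeper tree and both the boundaries and the region models are trained with Newton-Raphson steps; the single gated split above is only meant to show how a trainable boundary and per-region second-order updates fit together in one online pass.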