Nonlinear regression via incremental decision trees
buir.contributor.author | Neyshabouri, Mohammadreza Mohaghegh | |
buir.contributor.author | Kozat, Süleyman S. | |
dc.citation.epage | 13 | en_US |
dc.citation.spage | 1 | en_US |
dc.citation.volumeNumber | 86 | en_US |
dc.contributor.author | Vanlı, N. | en_US |
dc.contributor.author | Sayın, M. | en_US |
dc.contributor.author | Neyshabouri, Mohammadreza Mohaghegh | en_US |
dc.contributor.author | Özkan, H. | en_US |
dc.contributor.author | Kozat, Süleyman S. | en_US |
dc.date.accessioned | 2020-02-12T11:57:18Z | |
dc.date.available | 2020-02-12T11:57:18Z | |
dc.date.issued | 2019 | |
dc.department | Department of Electrical and Electronics Engineering | en_US |
dc.description.abstract | We study sequential nonlinear regression and introduce an online algorithm that elegantly mitigates, via an adaptively incremental hierarchical structure, the convergence and undertraining issues of conventional nonlinear regression methods. In particular, we present a piecewise linear (or nonlinear) regression algorithm that partitions the regressor space and learns a linear model in each region, combining them to form the final estimate. Unlike conventional approaches, our algorithm effectively learns the optimal regressor space partition with the desired complexity in a completely sequential and data-driven manner. Our algorithm sequentially and asymptotically achieves the performance of the optimal twice-differentiable regression function for any data sequence without any statistical assumptions. The introduced algorithm can be efficiently implemented with a computational complexity that is only logarithmic in the length of the data. In our experiments, we demonstrate significant gains on well-known real benchmark data sets compared to state-of-the-art techniques. | en_US |
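To illustrate the region-wise linear modeling idea described in the abstract, the following is a minimal, hedged sketch: it partitions a 1-D regressor space [0, 1] into fixed equal-width regions (a depth-limited binary tree) and runs an online gradient-descent linear learner in each region. It is a simplified illustration only, not the paper's adaptive incremental-tree algorithm — the region boundaries here are fixed in advance and there is no hierarchical combination of models, both of which the paper's method learns sequentially.

```python
import numpy as np

class PiecewiseLinearSGD:
    """Illustrative online piecewise-linear regressor (assumed design,
    not the paper's algorithm): 2**depth fixed regions on [0, 1], each
    with its own (slope, intercept) pair trained by online SGD."""

    def __init__(self, depth=2, lr=0.1):
        self.n_regions = 2 ** depth
        self.w = np.zeros((self.n_regions, 2))  # per-region (slope, intercept)
        self.lr = lr

    def _region(self, x):
        # Locate the leaf region containing x in [0, 1).
        idx = int(x * self.n_regions)
        return min(idx, self.n_regions - 1)

    def predict(self, x):
        slope, intercept = self.w[self._region(x)]
        return slope * x + intercept

    def update(self, x, y):
        # One stochastic-gradient step on squared error in x's region.
        r = self._region(x)
        err = self.predict(x) - y
        self.w[r, 0] -= self.lr * err * x
        self.w[r, 1] -= self.lr * err

# Sequential (online) protocol: predict first, then reveal y and update.
rng = np.random.default_rng(0)
model = PiecewiseLinearSGD(depth=3, lr=0.2)
losses = []
for t in range(5000):
    x = rng.random()
    y = np.sin(2 * np.pi * x)  # a nonlinear target the regions approximate
    losses.append((model.predict(x) - y) ** 2)
    model.update(x, y)
```

With 8 regions, the per-round squared loss drops as each region's linear model converges to the local best linear fit of the sine curve; finer partitions trade approximation error against the amount of data each region sees, which is the trade-off the paper's sequential partition learning addresses.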
dc.description.provenance | Submitted by Onur Emek (onur.emek@bilkent.edu.tr) on 2020-02-12T11:57:18Z No. of bitstreams: 1 Bilkent-research-paper.pdf: 268963 bytes, checksum: ad2e3a30c8172b573b9662390ed2d3cf (MD5) | en |
dc.description.provenance | Made available in DSpace on 2020-02-12T11:57:18Z (GMT). No. of bitstreams: 1 Bilkent-research-paper.pdf: 268963 bytes, checksum: ad2e3a30c8172b573b9662390ed2d3cf (MD5) Previous issue date: 2018 | en |
dc.embargo.release | 2022-02-01 | |
dc.identifier.doi | 10.1016/j.patcog.2018.08.014 | en_US |
dc.identifier.issn | 0031-3203 | |
dc.identifier.uri | http://hdl.handle.net/11693/53312 | |
dc.language.iso | English | en_US |
dc.publisher | Elsevier | en_US |
dc.relation.isversionof | https://doi.org/10.1016/j.patcog.2018.08.014 | en_US |
dc.source.title | Pattern Recognition | en_US |
dc.subject | Online regression | en_US |
dc.subject | Sequential learning | en_US |
dc.subject | Nonlinear models | en_US |
dc.subject | Incremental decision trees | en_US |
dc.title | Nonlinear regression via incremental decision trees | en_US |
dc.type | Article | en_US |
Files
Original bundle
- Name:
- Nonlinear_regression_via_incremental_decision_trees.pdf
- Size:
- 1.35 MB
- Format:
- Adobe Portable Document Format
- Description:
License bundle
- Name:
- license.txt
- Size:
- 1.71 KB
- Format:
- Description:
- Item-specific license agreed upon to submission