Nonlinear regression via incremental decision trees

buir.contributor.author: Neyshabouri, Mohammadreza Mohaghegh
buir.contributor.author: Kozat, Süleyman S.
dc.citation.epage: 13
dc.citation.spage: 1
dc.citation.volumeNumber: 86
dc.contributor.author: Vanlı, N.
dc.contributor.author: Sayın, M.
dc.contributor.author: Neyshabouri, Mohammadreza Mohaghegh
dc.contributor.author: Özkan, H.
dc.contributor.author: Kozat, Süleyman S.
dc.date.accessioned: 2020-02-12T11:57:18Z
dc.date.available: 2020-02-12T11:57:18Z
dc.date.issued: 2019
dc.department: Department of Electrical and Electronics Engineering
dc.description.abstract: We study sequential nonlinear regression and introduce an online algorithm that mitigates, via an adaptively incremental hierarchical structure, the convergence and undertraining issues of conventional nonlinear regression methods. In particular, we present a piecewise linear (or nonlinear) regression algorithm that partitions the regressor space, learns a linear model in each region, and combines these models. Unlike conventional approaches, our algorithm learns the optimal partition of the regressor space, at the desired complexity, in a completely sequential and data-driven manner. It sequentially and asymptotically achieves the performance of the optimal twice-differentiable regression function for any data sequence, without any statistical assumptions. The algorithm can be implemented efficiently, with computational complexity that is only logarithmic in the data length. In our experiments, we demonstrate significant gains over state-of-the-art techniques on well-known real benchmark data sets. (An illustrative sketch of the piecewise-linear idea follows this record.)
dc.description.provenance: Submitted by Onur Emek (onur.emek@bilkent.edu.tr) on 2020-02-12T11:57:18Z. No. of bitstreams: 1. Bilkent-research-paper.pdf: 268963 bytes, checksum: ad2e3a30c8172b573b9662390ed2d3cf (MD5)
dc.description.provenance: Made available in DSpace on 2020-02-12T11:57:18Z (GMT). No. of bitstreams: 1. Bilkent-research-paper.pdf: 268963 bytes, checksum: ad2e3a30c8172b573b9662390ed2d3cf (MD5). Previous issue date: 2018
dc.embargo.release: 2022-02-01
dc.identifier.doi: 10.1016/j.patcog.2018.08.014
dc.identifier.issn: 0031-3203
dc.identifier.uri: http://hdl.handle.net/11693/53312
dc.language.iso: English
dc.publisher: Elsevier
dc.relation.isversionof: https://doi.org/10.1016/j.patcog.2018.08.014
dc.source.title: Pattern Recognition
dc.subject: Online regression
dc.subject: Sequential learning
dc.subject: Nonlinear models
dc.subject: Incremental decision trees
dc.title: Nonlinear regression via incremental decision trees
dc.type: Article
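
Illustrative sketch

The abstract describes a scheme that partitions the regressor space, fits a linear model in each region, and refines the partition sequentially as data arrive. The Python sketch below illustrates only that general idea under stated assumptions; it is not the paper's algorithm and carries none of its guarantees. The class names (IncrementalPiecewiseLinear, Leaf, Node), the variance/median split rule, the learning rate, and the depth and sample thresholds are all hypothetical choices made for the example.

# Minimal illustrative sketch (not the paper's algorithm): a sequential
# piecewise-linear regressor that grows a binary partition of the regressor
# space and fits an online linear model in each region with gradient steps.
# All names, thresholds, and the split rule are assumptions for illustration.
import numpy as np


class Leaf:
    """A region of the regressor space with its own online linear model."""

    def __init__(self, dim, lr=0.05):
        self.w = np.zeros(dim + 1)  # linear weights plus a bias term
        self.lr = lr
        self.samples = []           # buffer used only to decide when to split

    def predict(self, x):
        return float(self.w @ np.append(x, 1.0))

    def update(self, x, y):
        xb = np.append(x, 1.0)
        self.w += self.lr * (y - self.w @ xb) * xb  # one stochastic gradient step
        self.samples.append(np.asarray(x, dtype=float))


class Node:
    """Internal node: routes points by thresholding one coordinate."""

    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right


class IncrementalPiecewiseLinear:
    def __init__(self, dim, max_leaf_samples=50, max_depth=4):
        self.dim = dim
        self.max_leaf_samples = max_leaf_samples
        self.max_depth = max_depth
        self.root = Leaf(dim)

    def _route(self, x):
        """Walk the tree and return (leaf, parent, side, depth) for x."""
        node, parent, side, depth = self.root, None, None, 0
        while isinstance(node, Node):
            side = "left" if x[node.feature] <= node.threshold else "right"
            parent, node, depth = node, getattr(node, side), depth + 1
        return node, parent, side, depth

    def predict(self, x):
        return self._route(x)[0].predict(x)

    def update(self, x, y):
        leaf, parent, side, depth = self._route(x)
        leaf.update(x, y)
        # Split the region once it has seen enough data: median split on the
        # highest-variance coordinate, children warm-started from the parent.
        if len(leaf.samples) >= self.max_leaf_samples and depth < self.max_depth:
            data = np.array(leaf.samples)
            feature = int(np.argmax(data.var(axis=0)))
            threshold = float(np.median(data[:, feature]))
            left, right = Leaf(self.dim), Leaf(self.dim)
            left.w[:], right.w[:] = leaf.w, leaf.w
            new_node = Node(feature, threshold, left, right)
            if parent is None:
                self.root = new_node
            else:
                setattr(parent, side, new_node)


# Sequential protocol: predict first, then observe the label and update.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = IncrementalPiecewiseLinear(dim=1)
    se = 0.0
    for t in range(2000):
        x = rng.uniform(-1.0, 1.0, size=1)
        y = np.sin(3.0 * x[0]) + 0.1 * rng.normal()  # nonlinear target
        se += (y - model.predict(x)) ** 2
        model.update(x, y)
    print(f"mean squared prediction error: {se / 2000:.4f}")

In this sketch each new region inherits its parent's weights as a warm start, so the model refines rather than restarts as the partition grows; the paper's incremental tree construction and the way it combines the region models are more involved and come with the guarantees stated in the abstract.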

Files

Original bundle

Name: Nonlinear_regression_via_incremental_decision_trees.pdf
Size: 1.35 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission