Twice-universal piecewise linear regression via infinite depth context trees

dc.citation.epage: 2055
dc.citation.spage: 2051
dc.contributor.author: Vanlı, Nuri Denizcan
dc.contributor.author: Sayın, Muhammed O.
dc.contributor.author: Göze, T.
dc.contributor.author: Kozat, Süleyman Selim
dc.coverage.spatial: Brisbane, QLD, Australia
dc.date.accessioned: 2016-02-08T12:06:49Z
dc.date.available: 2016-02-08T12:06:49Z
dc.date.issued: 2015
dc.department: Department of Electrical and Electronics Engineering
dc.description: Date of Conference: 19-24 April 2015
dc.description: Conference Name: International Conference on Acoustics, Speech, and Signal Processing, IEEE 2015
dc.description.abstract: We investigate the problem of sequential piecewise linear regression within a competitive framework. For an arbitrary and unknown data length n, we first introduce a method to partition the regressor space. In particular, we present a recursive method that divides the regressor space into O(n) disjoint regions, which can yield approximately 1.5^n different piecewise linear models on the regressor space. For each region, we introduce a universal linear regressor that performs nearly as well as the best linear regressor whose parameters are set non-causally. We then use an infinite depth context tree to represent all piecewise linear models and introduce a universal algorithm to achieve the performance of the best piecewise linear model that can be selected in hindsight. In this sense, the introduced algorithm is twice-universal: it sequentially achieves the performance of the best model that uses the optimal regression parameters. Our algorithm achieves this performance with a computational complexity upper bounded by O(n) in the worst case and O(log(n)) under certain regularity conditions. We provide an explicit description of the algorithm as well as upper bounds on the regret with respect to the best nonlinear and piecewise linear models, and demonstrate the performance of the algorithm through simulations.
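The abstract describes fitting a separate universal linear regressor on each region of a partitioned regressor space. The paper's actual algorithm grows the partition adaptively via an infinite depth context tree; the sketch below is only an illustrative simplification, assuming a fixed four-region partition of a scalar regressor space with a recursive least squares (RLS) learner per region. The class name, region count, and target function are hypothetical.

```python
import numpy as np

class PiecewiseLinearRLS:
    """Illustrative sketch: one RLS linear regressor per fixed region of [0, 1).
    This is NOT the paper's context-tree algorithm, only the per-region idea."""

    def __init__(self, n_regions=4, dim=2, delta=1.0):
        self.n_regions = n_regions
        # Per-region RLS state: inverse correlation matrix P and weight vector w.
        self.P = [np.eye(dim) / delta for _ in range(n_regions)]
        self.w = [np.zeros(dim) for _ in range(n_regions)]

    def region(self, x):
        # Assign scalar x in [0, 1) to one of the disjoint regions.
        return min(int(x * self.n_regions), self.n_regions - 1)

    def predict(self, x):
        phi = np.array([x, 1.0])  # affine regressor [x, 1]
        return float(self.w[self.region(x)] @ phi)

    def update(self, x, y):
        # Standard RLS update, applied only to the active region.
        r = self.region(x)
        phi = np.array([x, 1.0])
        P = self.P[r]
        k = P @ phi / (1.0 + phi @ P @ phi)  # gain vector
        self.w[r] = self.w[r] + k * (y - self.w[r] @ phi)
        self.P[r] = P - np.outer(k, phi @ P)

# Usage: sequentially learn y = |x - 0.5|, which is piecewise linear in x.
model = PiecewiseLinearRLS()
rng = np.random.default_rng(0)
for _ in range(2000):
    x = rng.random()
    model.update(x, abs(x - 0.5))
```

Because the target is exactly linear within each region, each regional RLS learner converges to the corresponding linear piece; the competitive guarantees in the paper come from additionally weighting all partitions represented by the context tree.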
dc.identifier.doi: 10.1109/ICASSP.2015.7178331
dc.identifier.issn: 1520-6149
dc.identifier.uri: http://hdl.handle.net/11693/27963
dc.language.iso: English
dc.publisher: IEEE
dc.relation.isversionof: http://dx.doi.org/10.1109/ICASSP.2015.7178331
dc.source.title: Proceedings of the International Conference on Acoustics, Speech, and Signal Processing, IEEE 2015
dc.subject: Infinite depth context tree
dc.subject: Nonlinear
dc.subject: Piecewise linear
dc.subject: Regression
dc.subject: Sequential
dc.title: Twice-universal piecewise linear regression via infinite depth context trees
dc.type: Conference Paper

Files

Original bundle
Name: Twice-universal piecewise linear regression via infinite depth context trees.pdf
Size: 582.3 KB
Format: Adobe Portable Document Format
Description: Full printable version