Authors: Vanlı, Nuri Denizcan; Sayın, Muhammed O.; Göze, T.; Kozat, Süleyman Selim
Date Available: 2016-02-08
Date Issued: 2015
ISSN: 1520-6149
Handle: http://hdl.handle.net/11693/27963
Date of Conference: 19-24 April 2015
Conference Name: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2015

Abstract: We investigate the problem of sequential piecewise linear regression in a competitive framework. For an arbitrary and unknown data length n, we first introduce a method to partition the regressor space. In particular, we present a recursive method that divides the regressor space into O(n) disjoint regions, which can yield approximately 1.5^n different piecewise linear models on the regressor space. For each region, we introduce a universal linear regressor whose performance is nearly as good as that of the best linear regressor whose parameters are set non-causally. We then use an infinite-depth context tree to represent all piecewise linear models and introduce a universal algorithm that achieves the performance of the best piecewise linear model selectable in hindsight. In this sense, the introduced algorithm is twice-universal: it sequentially achieves the performance of the best model that uses the optimal regression parameters. Our algorithm attains this performance with a computational complexity upper bounded by O(n) in the worst case and O(log(n)) under certain regularity conditions. We provide an explicit description of the algorithm as well as upper bounds on the regret with respect to the best nonlinear and piecewise linear models, and demonstrate the performance of the algorithm through simulations.

Language: English
Keywords: Infinite depth context tree; Nonlinear; Piecewise linear; Regression; Sequential
Title: Twice-universal piecewise linear regression via infinite depth context trees
Type: Conference Paper
DOI: 10.1109/ICASSP.2015.7178331
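
Since the abstract outlines an algorithmic recipe (a universal linear regressor per region, combined over all prunings of a context tree), a small sketch may help illustrate the structure. The code below uses a fixed-depth binary tree over a one-dimensional regressor, a recursive-least-squares (RLS) model per node, and the standard context-tree-weighting half-half exponential mixture over subtrees. The class names, the 1-D setting, the fixed depth, and the learning constant a are all illustrative assumptions; the paper's construction instead grows an infinite-depth tree incrementally, which is what yields the stated O(n) and O(log(n)) complexity bounds.

```python
# A minimal sketch of sequential piecewise linear regression on a
# fixed-depth binary context tree (an assumed simplification of the
# paper's infinite-depth construction).
import numpy as np

class Node:
    """One region of the regressor space, with its own RLS linear model."""
    def __init__(self, lo, hi, depth, max_depth):
        self.mid = 0.5 * (lo + hi)
        self.w = np.zeros(2)              # [slope, intercept]
        self.P = np.eye(2)                # inverse correlation matrix (RLS)
        self.log_perf = 0.0               # log of exp(-a * cumulative loss)
        self.log_mix = 0.0                # CTW-style mixture over subtrees
        self.children = None
        if depth < max_depth:
            self.children = (Node(lo, self.mid, depth + 1, max_depth),
                             Node(self.mid, hi, depth + 1, max_depth))

    def child(self, x):
        return self.children[0] if x < self.mid else self.children[1]

    def predict(self, x):
        if self.children is None:
            return self.w @ np.array([x, 1.0])
        # Blend this node's model with the deeper subtree along x's path,
        # weighted by their normalized past exponential performance.
        w_self = np.exp(np.log(0.5) + self.log_perf - self.log_mix)
        return (w_self * (self.w @ np.array([x, 1.0]))
                + (1.0 - w_self) * self.child(x).predict(x))

    def update(self, x, y, a=0.5):
        """Observe (x, y): update the RLS model, then the mixture weights."""
        u = np.array([x, 1.0])
        err = y - self.w @ u                       # prequential error
        k = self.P @ u / (1.0 + u @ self.P @ u)    # RLS gain
        self.w = self.w + k * err
        self.P = self.P - np.outer(k, u @ self.P)
        self.log_perf += -a * err ** 2             # exponential weighting
        if self.children is None:
            self.log_mix = self.log_perf
        else:
            self.child(x).update(x, y, a)          # recurse along x's path only
            kids = self.children[0].log_mix + self.children[1].log_mix
            self.log_mix = np.logaddexp(np.log(0.5) + self.log_perf,
                                        np.log(0.5) + kids)

if __name__ == "__main__":
    # Toy run: sequentially track a piecewise linear target on [-1, 1].
    rng = np.random.default_rng(0)
    root = Node(-1.0, 1.0, depth=0, max_depth=4)
    sq_err = 0.0
    for t in range(2000):
        x = rng.uniform(-1.0, 1.0)
        y = (2.0 * x if x < 0.0 else 1.0 - x) + 0.1 * rng.standard_normal()
        sq_err += (y - root.predict(x)) ** 2       # predict before updating
        root.update(x, y)
    print(f"average sequential squared error: {sq_err / 2000:.4f}")
```

The half-half mixture is the standard context-tree-weighting device: it keeps the combined predictor competitive against every pruning of the tree (roughly 1.5^n piecewise models for n leaf regions) while the per-sample cost stays linear in the tree depth, since only the nodes along the current sample's path are touched.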