Browsing by Subject "Sequential"
Now showing 1 - 4 of 4
Item Open Access: Bağlam ağaçları ile ardışık doğrusal olmayan bağlanım [Sequential nonlinear regression via context trees] (IEEE, 2014-04)
Vanlı, N. Denizcan; Kozat, Süleyman S.
In this paper, the sequential nonlinear regression problem is studied and an efficient learning algorithm based on context trees is introduced. To this end, the regression space is partitioned into regions represented by a context tree. By running an independent regression algorithm in each region, the predictions of all nonlinear models that the context tree can represent are adaptively combined, with a computational complexity that is linear in the number of nodes of the context tree. The performance bounds of the proposed algorithm are derived without any statistical assumptions on the data, and a numerical example is provided to illustrate the theoretical results.

Item Open Access: Online classification via self-organizing space partitioning (Institute of Electrical and Electronics Engineers Inc., 2016)
Ozkan, H.; Vanli, N. D.; Kozat, S. S.
The authors study online supervised learning under the empirical zero-one loss and introduce a novel classification algorithm with strong theoretical guarantees. The proposed method is a highly dynamic, self-organizing decision tree that adaptively partitions the feature space into small regions and combines (takes the union of) the simple local classification models specialized in those regions. The approach sequentially and directly minimizes the cumulative loss by jointly learning the optimal feature space partitioning and the corresponding individual region classifiers. Overtraining is mitigated by using basic linear classifiers in each region, while the hierarchical, data-adaptive structure provides superior modeling power. The computational complexity of the algorithm scales linearly with the dimensionality of the feature space and the depth of the tree. The algorithm can be applied to any streaming data without a training phase or a priori information, processing each sample on the fly and then discarding it; it is therefore especially suitable for applications requiring sequential data processing at large scales or high rates. The authors present a comprehensive experimental study in stationary and nonstationary environments, comparing their algorithm with state-of-the-art methods over well-known benchmark datasets and showing it to be computationally far superior. The proposed algorithm significantly outperforms the competing methods in stationary settings and demonstrates remarkable adaptation to nonstationarity in the presence of drifting concepts and abrupt concept changes. © 1991-2012 IEEE.

Item Open Access: Piecewise nonlinear regression via decision adaptive trees (IEEE, 2014-09)
Vanlı, N. Denizcan; Sayın, Muhammed O.; Ergüt, S.; Kozat, Süleyman S.
We investigate the problem of adaptive nonlinear regression and introduce tree-based piecewise linear regression algorithms that are highly efficient and provide significantly improved performance, with guaranteed upper bounds in an individual-sequence manner. We partition the regressor space using hyperplanes in a nested structure according to the notion of a tree. In this manner, we introduce an adaptive nonlinear regression algorithm that not only adapts the regressor of each partition but also learns the complete tree structure, with a computational complexity only polynomial in the number of nodes of the tree. Our algorithm is constructed to directly minimize the final regression error without introducing any ad hoc parameters. Moreover, our method can readily incorporate any tree construction method, as demonstrated in the paper. © 2014 EURASIP.

Item Open Access: Twice-universal piecewise linear regression via infinite depth context trees (IEEE, 2015)
Vanlı, Nuri Denizcan; Sayın, Muhammed O.; Göze, T.; Kozat, Süleyman Selim
We investigate the problem of sequential piecewise linear regression in a competitive framework. For an arbitrary and unknown data length n, we first introduce a method to partition the regressor space. In particular, we present a recursive method that divides the regressor space into O(n) disjoint regions, which can yield approximately 1.5^n different piecewise linear models on the regressor space. For each region, we introduce a universal linear regressor that performs nearly as well as the best linear regressor whose parameters are set non-causally. We then use an infinite-depth context tree to represent all piecewise linear models and introduce a universal algorithm that achieves the performance of the best piecewise linear model selectable in hindsight. In this sense, the introduced algorithm is twice-universal: it sequentially achieves the performance of the best model that uses the optimal regression parameters. Our algorithm attains this performance with a computational complexity upper bounded by O(n) in the worst case and O(log(n)) under certain regularity conditions. We provide an explicit description of the algorithm as well as upper bounds on the regret with respect to the best nonlinear and piecewise linear models, and demonstrate the performance of the algorithm through simulations.
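The four items above share a common construction: partition the regressor (or feature) space with a tree, run a simple linear learner in each region, and update only the region that the current sample falls into. The sketch below illustrates just that piecewise-linear core on a fixed one-dimensional binary partition with LMS updates; it is a minimal toy, not the authors' algorithms, which additionally weight all subtrees of a context tree and can learn the partition itself. The class name, step size, and fixed-depth partition are illustrative choices, not taken from the papers.

```python
import numpy as np

class PiecewiseLinearLMS:
    """Toy sequential piecewise linear regressor.

    The regressor space [-1, 1] is split into 2**depth equal regions
    (a fixed binary-tree partition); each region keeps its own affine
    model w*x + b, updated online by LMS on the samples it receives.
    """

    def __init__(self, depth=3, mu=0.2):
        self.n_regions = 2 ** depth
        self.mu = mu                        # LMS step size
        # per-region weights [w, b] for the prediction w*x + b
        self.w = np.zeros((self.n_regions, 2))

    def _region(self, x):
        # map x in [-1, 1] to a leaf index of the binary partition
        idx = int((x + 1.0) / 2.0 * self.n_regions)
        return min(max(idx, 0), self.n_regions - 1)

    def predict(self, x):
        r = self._region(x)
        return self.w[r, 0] * x + self.w[r, 1]

    def update(self, x, d):
        # sequential step: predict, observe the desired output d,
        # then LMS-update only the active region's weights
        r = self._region(x)
        e = d - self.predict(x)
        self.w[r] += self.mu * e * np.array([x, 1.0])
        return e

# Usage: track the nonlinear target d = x^2 with piecewise linear fits.
rng = np.random.default_rng(0)
model = PiecewiseLinearLMS(depth=3, mu=0.2)
errs = []
for t in range(5000):
    x = rng.uniform(-1.0, 1.0)
    errs.append(model.update(x, x * x) ** 2)
# the running squared error shrinks as each region's model converges
```

Because each sample touches a single leaf, the per-sample cost is constant in the number of regions; the papers' context-tree machinery pays a factor proportional to the tree depth instead, in exchange for competing against every partition the tree can represent rather than one fixed partition.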