Show simple item record

dc.contributor.advisor	Kozat, Süleyman Serdar
dc.contributor.author	Kari, Dariush
dc.date.accessioned	2017-07-17T13:08:15Z
dc.date.available	2017-07-17T13:08:15Z
dc.date.copyright	2017-07
dc.date.issued	2017-07
dc.date.submitted	2017-07-12
dc.identifier.uri	http://hdl.handle.net/11693/33398
dc.description	Cataloged from PDF version of article.	en_US
dc.description	Thesis (M.S.): İhsan Doğramacı Bilkent University, Department of Electrical and Electronics Engineering, 2017.	en_US
dc.description	Includes bibliographical references (leaves 51-57).	en_US
dc.description.abstract	We investigate boosted online regression and propose a novel family of regression algorithms with strong theoretical bounds. In addition, we implement several variants of the proposed generic algorithm. We provide theoretical performance bounds for our proposed algorithms that hold in a strong mathematical sense, and we achieve guaranteed performance improvement over conventional online regression methods without any statistical assumptions on the desired data or feature vectors. We demonstrate an intrinsic relationship, in terms of boosting, between the adaptive mixture-of-experts and data reuse algorithms. Furthermore, we introduce a boosting algorithm based on random updates that is significantly faster than conventional boosting methods and the other variants of our proposed algorithms while achieving an enhanced performance gain; hence, the random updates method is particularly suitable for fast, high-dimensional streaming data. Specifically, we investigate Recursive Least Squares (RLS)-based and Least Mean Squares (LMS)-based linear regression algorithms in a mixture-of-experts setting, and provide several variants of these well-known adaptation methods. Moreover, we extend the proposed algorithms to other filters; in particular, we investigate the effect of the proposed algorithms on piecewise linear filters. We also provide theoretical bounds for the computational complexity of our proposed algorithms. We demonstrate substantial performance gains in terms of mean square error over the constituent filters through an extensive set of benchmark real data sets and simulated examples.	en_US
dc.description.statementofresponsibility	by Dariush Kari.	en_US
dc.format.extent	xi, 59 leaves : charts ; 29 cm	en_US
dc.language.iso	English	en_US
dc.rights	info:eu-repo/semantics/openAccess	en_US
dc.subject	Online boosting	en_US
dc.subject	Online regression	en_US
dc.subject	Boosted regression	en_US
dc.subject	Ensemble learning	en_US
dc.subject	Smooth boost	en_US
dc.subject	Mixture methods	en_US
dc.title	Boosted adaptive filters	en_US
dc.title.alternative	İyileştirilmiş uyarlanır süzgeçler	en_US
dc.type	Thesis	en_US
dc.department	Department of Electrical and Electronics Engineering	en_US
dc.publisher	Bilkent University	en_US
dc.description.degree	M.S.	en_US
dc.identifier.itemid	B155999
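The abstract names LMS-based linear regression and a mixture-of-experts combination as building blocks. As a generic illustration of those two ideas, here is a minimal textbook sketch: a plain LMS adaptive filter, and an exponentially weighted mixture of two LMS experts. This is not the thesis's boosted algorithm; all function names, step sizes, and the weighting rate `eta` are illustrative assumptions.

```python
import numpy as np

def lms_filter(x, d, mu=0.01, order=4):
    """Plain LMS adaptive filter: predict d[n] from the last `order` inputs."""
    w = np.zeros(order)
    errors = np.zeros(len(d))
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]      # most recent samples first
        y = w @ u                     # filter output
        e = d[n] - y                  # instantaneous error
        w = w + mu * e * u            # stochastic-gradient (LMS) update
        errors[n] = e
    return w, errors

def mixture_of_lms(x, d, mus=(0.005, 0.05), order=4, eta=0.5):
    """Exponentially weighted mixture of LMS experts with different step sizes."""
    experts = [np.zeros(order) for _ in mus]
    log_w = np.zeros(len(mus))        # log mixture weights
    errors = np.zeros(len(d))
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]
        preds = np.array([w @ u for w in experts])
        p = np.exp(log_w - log_w.max())
        p /= p.sum()                  # normalized mixture weights
        errors[n] = d[n] - p @ preds  # error of the combined prediction
        for i, mu in enumerate(mus):
            e_i = d[n] - preds[i]
            experts[i] += mu * e_i * u      # each expert runs its own LMS
            log_w[i] -= eta * e_i ** 2      # down-weight poorly performing experts
    return errors
```

A mixture like this tracks the better-performing expert over time, which is the generic motivation for running several adaptation methods (e.g. different step sizes, or RLS alongside LMS) in parallel rather than committing to one in advance.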

