
      Less is more: a comprehensive framework for the number of components of ensemble classifiers

      Author(s): Bonab, H.; Can, Fazlı
      Date: 2019
      Source Title: IEEE Transactions on Neural Networks and Learning Systems
      Print ISSN: 2162-237X
      Publisher: IEEE
      Volume: 30
      Issue: 9
      Pages: 2735 - 2745
      Language: English
      Type: Article
      Abstract
      The number of component classifiers chosen for an ensemble greatly impacts the prediction ability. In this paper, we use a geometric framework for a priori determining the ensemble size, which is applicable to most of the existing batch and online ensemble classifiers. There are only a limited number of studies on the ensemble size examining majority voting (MV) and weighted MV (WMV). Almost all of them are designed for batch-mode, hardly addressing online environments. Big data dimensions and resource limitations, in terms of time and memory, make the determination of ensemble size crucial, especially for online environments. For the MV aggregation rule, our framework proves that the more strong components we add to the ensemble, the more accurate predictions we can achieve. For the WMV aggregation rule, our framework proves the existence of an ideal number of components, which is equal to the number of class labels, with the premise that components are completely independent of each other and strong enough. While giving the exact definition for a strong and independent classifier in the context of an ensemble is a challenging task, our proposed geometric framework provides a theoretical explanation of diversity and its impact on the accuracy of predictions. We conduct a series of experimental evaluations to show the practical value of our theorems and existing challenges.
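The two aggregation rules the abstract analyzes, majority voting (MV) and weighted majority voting (WMV), can be sketched as follows. This is a minimal illustration of the voting rules themselves, not of the paper's geometric framework; the function names and example weights are our own.

```python
from collections import Counter

def majority_vote(predictions):
    """Plain majority voting (MV): every component classifier casts one
    equal vote, and the most frequently predicted label wins."""
    return Counter(predictions).most_common(1)[0][0]

def weighted_majority_vote(predictions, weights):
    """Weighted majority voting (WMV): each component's vote is scaled by
    its weight (e.g., an estimate of its accuracy), and the label with
    the largest total weight wins."""
    scores = {}
    for label, weight in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + weight
    return max(scores, key=scores.get)

# Five component classifiers predicting one of the labels {"a", "b"}.
preds = ["a", "b", "a", "b", "b"]
print(majority_vote(preds))  # "b": wins the head count 3-2
# With weights, the two strong "a" voters outweigh the three weak "b" voters:
print(weighted_majority_vote(preds, [0.9, 0.2, 0.8, 0.3, 0.3]))  # "a": 1.7 > 0.8
```

Under WMV, the outcome can differ from MV whenever strong components disagree with the majority, which is why the number and strength of components matters differently for the two rules.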
      Keywords: Data stream; Ensemble cardinality; Ensemble size; Law of diminishing returns; Majority voting (MV); Supervised learning; Voting framework; Weighted MV (WMV)
      Permalink: http://hdl.handle.net/11693/75936
      Published Version (Please cite this version): https://dx.doi.org/10.1109/TNNLS.2018.2886341
      Collections: Department of Computer Engineering


      © Bilkent University - Library IT