Less is more: a comprehensive framework for the number of components of ensemble classifiers
buir.contributor.author | Can, Fazlı | |
dc.citation.epage | 2745 | en_US |
dc.citation.issueNumber | 9 | en_US |
dc.citation.spage | 2735 | en_US |
dc.citation.volumeNumber | 30 | en_US |
dc.contributor.author | Bonab, H. | en_US |
dc.contributor.author | Can, Fazlı | en_US |
dc.date.accessioned | 2021-03-16T08:52:33Z | |
dc.date.available | 2021-03-16T08:52:33Z | |
dc.date.issued | 2019 | |
dc.department | Department of Computer Engineering | en_US |
dc.description.abstract | The number of component classifiers chosen for an ensemble greatly impacts its prediction ability. In this paper, we use a geometric framework for determining the ensemble size a priori, which is applicable to most existing batch and online ensemble classifiers. Only a limited number of studies examine the ensemble size for majority voting (MV) and weighted MV (WMV), and almost all of them are designed for batch mode, rarely addressing online environments. Big data dimensions and resource limitations, in terms of time and memory, make determining the ensemble size crucial, especially for online environments. For the MV aggregation rule, our framework proves that the more strong components we add to the ensemble, the more accurate its predictions become. For the WMV aggregation rule, our framework proves the existence of an ideal number of components, equal to the number of class labels, with the premise that the components are completely independent of each other and strong enough. While giving an exact definition of a strong and independent classifier in the context of an ensemble is a challenging task, our proposed geometric framework provides a theoretical explanation of diversity and its impact on prediction accuracy. We conduct a series of experimental evaluations to show the practical value of our theorems and the existing challenges. | en_US |
dc.identifier.doi | 10.1109/TNNLS.2018.2886341 | en_US |
dc.identifier.issn | 2162-237X | |
dc.identifier.uri | http://hdl.handle.net/11693/75936 | |
dc.language.iso | English | en_US |
dc.publisher | IEEE | en_US |
dc.relation.isversionof | https://dx.doi.org/10.1109/TNNLS.2018.2886341 | en_US |
dc.source.title | IEEE Transactions on Neural Networks and Learning Systems | en_US |
dc.subject | Data stream | en_US |
dc.subject | Ensemble cardinality | en_US |
dc.subject | Ensemble size | en_US |
dc.subject | Law of diminishing returns | en_US |
dc.subject | Majority voting (MV) | en_US |
dc.subject | Supervised learning | en_US |
dc.subject | Voting framework | en_US |
dc.subject | Weighted MV (WMV) | en_US |
dc.title | Less is more: a comprehensive framework for the number of components of ensemble classifiers | en_US |
dc.type | Article | en_US |
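The MV and WMV aggregation rules described in the abstract can be illustrated with a minimal sketch. The Python snippet below is not taken from the paper: the function names, example predictions, and weights (which stand in for per-component accuracy estimates) are all hypothetical.

```python
from collections import Counter

def majority_vote(predictions):
    """Plain MV: every component contributes one unweighted vote."""
    return Counter(predictions).most_common(1)[0][0]

def weighted_majority_vote(predictions, weights):
    """WMV: each component's vote is scaled by its weight,
    e.g., an estimate of that component's individual accuracy."""
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# Hypothetical ensemble of 3 components over class labels {0, 1}.
component_predictions = [1, 0, 1]
component_weights = [0.9, 0.4, 0.7]  # assumed accuracy estimates

print(majority_vote(component_predictions))                        # -> 1
print(weighted_majority_vote(component_predictions,
                             component_weights))                   # -> 1
```

In terms of the abstract's results: under MV, adding more strong components keeps improving prediction accuracy, whereas under WMV, with components that are strong enough and completely independent, an ensemble whose size equals the number of class labels is already ideal.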
Files
Original bundle (1 of 1)
- Name: Less_Is_More_A_Comprehensive_Framework_for_the_Number_of_Components_of_Ensemble_Classifiers.pdf
- Size: 1.64 MB
- Format: Adobe Portable Document Format