Learning feature-projection based classifiers

Date
2012-03
Authors
Dayanik, A.
Source Title
Expert Systems with Applications: an international journal
Print ISSN
0957-4174
Publisher
Pergamon Press
Volume
39
Issue
4
Pages
4532-4544
Language
English
Abstract

This paper aims at designing better performing feature-projection based classification algorithms and presents two new such algorithms. These are batch supervised learning algorithms that represent the induced classification knowledge as feature intervals. In both algorithms, each feature participates in classification by casting real-valued votes for the classes, and the prediction for an unseen example is the class receiving the highest vote. The first algorithm, OFP.MC, learns, on each feature, pairwise disjoint intervals that minimize the feature classification error. The second algorithm, GFP.MC, constructs feature intervals by greedily improving the feature classification error. The new algorithms are empirically evaluated on twenty datasets from the UCI repository and compared with the existing feature-projection based classification algorithms (FILIF, VFI5, CFP, k-NNFP, and NBC). The experiments demonstrate that OFP.MC outperforms the other feature-projection based classification algorithms. GFP.MC is slightly inferior to OFP.MC, but for datasets with a large number of instances it reduces the space requirement of OFP.MC. Unlike the other feature-projection based classification algorithms considered here, the new algorithms are insensitive to boundary noise. (C) 2011 Elsevier Ltd. All rights reserved.
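
The abstract describes a prediction rule shared by both algorithms: each feature is partitioned into intervals, the interval containing a feature value casts real-valued votes for the classes, and the class with the highest aggregate vote is predicted. The Python sketch below illustrates only that voting step under assumed data structures; the intervals list of (lower, upper, votes) tuples and the classify function are hypothetical conveniences for illustration and do not reproduce the OFP.MC or GFP.MC interval-learning procedures described in the paper.

from collections import defaultdict

def classify(example, intervals, classes):
    """Predict a class by summing per-feature interval votes.

    intervals[f] is a list of (lower, upper, votes) tuples covering feature f
    with pairwise disjoint intervals; votes maps class labels to real-valued
    votes. This representation is an assumption for illustration, not the
    paper's own data structure.
    """
    total = defaultdict(float)
    for f, value in enumerate(example):
        for lower, upper, votes in intervals[f]:
            if lower <= value <= upper:      # locate the interval containing the value
                for c, v in votes.items():
                    total[c] += v            # the feature casts its real-valued votes
                break
    # The prediction is the class receiving the highest total vote.
    return max(classes, key=lambda c: total[c])

# Toy usage with two features and two classes (illustrative values only):
intervals = [
    [(0.0, 2.5, {"A": 0.8, "B": 0.2}), (2.5, 5.0, {"A": 0.1, "B": 0.9})],
    [(0.0, 1.0, {"A": 0.6, "B": 0.4}), (1.0, 3.0, {"A": 0.3, "B": 0.7})],
]
print(classify([1.2, 0.4], intervals, ["A", "B"]))  # -> A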
