Benefit maximizing classification using feature intervals
buir.advisor | Güvenir, Altay | |
dc.contributor.author | İkizler, Nazlı | |
dc.date.accessioned | 2016-07-01T10:56:13Z | |
dc.date.available | 2016-07-01T10:56:13Z | |
dc.date.issued | 2002 | |
dc.description | Cataloged from PDF version of article. | en_US |
dc.description.abstract | For a long time, classification algorithms have focused on minimizing the number of prediction errors, under the assumption that every possible error has identical consequences. In many real-world situations, however, this assumption does not hold. In a medical diagnosis domain, for instance, misdiagnosing a sick patient as healthy is far more serious than misdiagnosing a healthy patient as sick. There is therefore a need for classification methods that can handle asymmetric cost and benefit constraints. In this thesis, we discuss cost-sensitive classification concepts and propose a new classification algorithm, Benefit Maximization with Feature Intervals (BMFI), which uses a feature-projection-based knowledge representation. Within the BMFI framework, we introduce five voting methods that are shown to be effective over different domains, and we implement and empirically evaluate several generalization and pruning strategies based on the benefits of classification. Empirical evaluation shows that BMFI achieves promising results compared to recent wrapper cost-sensitive algorithms, although classifier performance depends strongly on the benefit constraints and class distributions of the domain. To evaluate cost-sensitive classification techniques, we also describe a new metric, benefit accuracy, which computes the ratio of the total benefit obtained to the maximum benefit achievable in the domain (an illustrative sketch follows the metadata listing below). | en_US |
dc.description.statementofresponsibility | İkizler, Nazlı | en_US |
dc.format.extent | xiv, 109 leaves, tables, graphs, 30 cm | en_US |
dc.identifier.itemid | BILKUTUPB067736 | |
dc.identifier.uri | http://hdl.handle.net/11693/29234 | |
dc.language.iso | English | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | machine learning | en_US |
dc.subject | classification | en_US |
dc.subject | cost-sensitivity | en_US |
dc.subject | benefit maximization | en_US |
dc.subject | feature intervals | en_US |
dc.subject | voting | en_US |
dc.subject | pruning | en_US |
dc.subject.lcc | Q325.5 .I35 2002 | en_US |
dc.subject.lcsh | Machine learning. | en_US |
dc.title | Benefit maximizing classification using feature intervals | en_US |
dc.type | Thesis | en_US |
thesis.degree.discipline | Computer Engineering | |
thesis.degree.grantor | Bilkent University | |
thesis.degree.level | Master's | |
thesis.degree.name | MS (Master of Science) | |
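
The abstract describes two computable ideas: choosing the class that maximizes expected benefit under a benefit matrix, and the proposed benefit-accuracy metric, defined as the total benefit obtained relative to the maximum benefit achievable. The sketch below is not taken from the thesis; the benefit matrix, probabilities, and function names are illustrative assumptions intended only to make those two definitions concrete.

import numpy as np

# Hypothetical benefit matrix B[actual_class, predicted_class] (not from the thesis):
# diagonal entries reward correct predictions, off-diagonal entries penalize errors
# asymmetrically (missing a "sick" case costs far more than a false alarm).
B = np.array([
    [  5.0,  -1.0],   # actual: healthy
    [-20.0,  10.0],   # actual: sick
])

def benefit_maximizing_prediction(class_probs, benefit_matrix):
    # Expected benefit of predicting class j is sum_i P(i) * B[i, j];
    # pick the prediction with the largest expected benefit.
    expected = class_probs @ benefit_matrix
    return int(np.argmax(expected))

def benefit_accuracy(true_classes, predicted_classes, benefit_matrix):
    # Total benefit actually obtained, divided by the maximum achievable
    # benefit (the benefit of predicting every instance correctly).
    total = sum(benefit_matrix[t, p] for t, p in zip(true_classes, predicted_classes))
    best = sum(benefit_matrix[t, t] for t in true_classes)
    return total / best

# Usage: with a 30% estimated probability of "sick", predicting "sick" (class 1)
# maximizes expected benefit because the miss penalty dominates.
print(benefit_maximizing_prediction(np.array([0.7, 0.3]), B))  # -> 1
print(benefit_accuracy([0, 1, 1, 0], [0, 1, 0, 0], B))         # -> 0.0

In this sketch, the estimated class probabilities stand in for the outcome of a voting scheme; the thesis itself defines five voting methods over feature intervals, which are not reproduced here.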
Files
Original bundle
1 - 1 of 1
- Name: 0002182.pdf
- Size: 403.8 KB
- Format: Adobe Portable Document Format
- Description: Full printable version