Subset based error recovery
buir.contributor.author | Ekmekcioğlu, Ömer | |
buir.contributor.author | Akkaya, Deniz | |
buir.contributor.author | Pınar, Mustafa Çelebi | |
buir.contributor.orcid | Akkaya, Deniz|0000-0002-7578-2516 | |
dc.citation.epage | 108361-8 | en_US |
dc.citation.spage | 108361-1 | en_US |
dc.citation.volumeNumber | 191 | en_US |
dc.contributor.author | Ekmekcioğlu, Ömer | |
dc.contributor.author | Akkaya, Deniz | |
dc.contributor.author | Pınar, Mustafa Çelebi | |
dc.date.accessioned | 2023-02-16T07:34:30Z | |
dc.date.available | 2023-02-16T07:34:30Z | |
dc.date.issued | 2021-10-12 | |
dc.department | Department of Industrial Engineering | en_US |
dc.description.abstract | We propose a data denoising method using the Extreme Learning Machine (ELM) structure, which allows us to use the Johnson-Lindenstrauß Lemma (JL) for preserving the Restricted Isometry Property (RIP) in order to give theoretical guarantees for recovery. Furthermore, we show that the method is equivalent to a robust two-layer ELM that implicitly benefits from the proposed denoising algorithm. Current robust ELM methods in the literature involve well-studied L1 and L2 regularization techniques as well as robust loss functions such as the Huber loss. We extend recent analysis from the robust regression literature so that it can be used effectively in more general, non-linear settings and is compatible with any ML algorithm, such as Neural Networks (NN). These methods are useful in scenarios where the observations suffer from heavy noise. We extend the use of ELM as a general data denoising method independent of the underlying ML algorithm. Tests of the denoising and regularized ELM methods are conducted on both synthetic and real data. Our method outperforms its competitors in most scenarios and successfully eliminates most of the noise. | en_US |
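The abstract describes robust regression over an ELM random feature map combined with hard thresholding of gross errors (cf. the "Hard thresholding" subject term). The following is a minimal illustrative sketch of that general idea, not the authors' implementation: the feature map, threshold rule, and all function names are assumptions introduced here for illustration.

```python
# Hedged sketch (not the authors' code): robust regression on ELM random
# features with iterative hard thresholding of the largest residuals,
# in the spirit of subset-based error recovery. All names are illustrative.
import numpy as np

def elm_features(X, n_hidden=200, rng=None):
    """Random ELM hidden layer: tanh(X W + b) with fixed random weights."""
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    return np.tanh(X @ W + b)

def robust_elm_fit(H, y, n_outliers, n_iter=50, ridge=1e-3):
    """Alternate ridge regression on corrected targets with hard
    thresholding that keeps only the largest residuals as gross errors."""
    n, d = H.shape
    e = np.zeros(n)                       # estimated sparse error vector
    A = H.T @ H + ridge * np.eye(d)
    for _ in range(n_iter):
        beta = np.linalg.solve(A, H.T @ (y - e))   # regression step
        r = y - H @ beta                           # residuals
        e = np.zeros(n)
        idx = np.argsort(np.abs(r))[-n_outliers:]  # hard-threshold step
        e[idx] = r[idx]
    return beta, e

# Usage: denoise targets by subtracting the estimated gross errors.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
y = np.sin(X[:, 0]) + 0.1 * X[:, 1]
y[:25] += 5.0                             # heavy noise on a subset
H = elm_features(X, rng=0)
beta, e_hat = robust_elm_fit(H, y, n_outliers=25)
y_denoised = y - e_hat
```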
dc.description.provenance | Submitted by Ezgi Uğurlu (ezgi.ugurlu@bilkent.edu.tr) on 2023-02-16T07:34:30Z No. of bitstreams: 1 Subset_based_error_recovery.pdf: 752954 bytes, checksum: 60a5c8ccad9a9251e30d5382547398a2 (MD5) | en |
dc.description.provenance | Made available in DSpace on 2023-02-16T07:34:30Z (GMT). No. of bitstreams: 1 Subset_based_error_recovery.pdf: 752954 bytes, checksum: 60a5c8ccad9a9251e30d5382547398a2 (MD5) Previous issue date: 2021-10-12 | en |
dc.embargo.release | 2023-10-12 | |
dc.identifier.doi | 10.1016/j.sigpro.2021.108361 | en_US |
dc.identifier.eissn | 1872-7557 | |
dc.identifier.issn | 0165-1684 | |
dc.identifier.uri | http://hdl.handle.net/11693/111396 | |
dc.language.iso | English | en_US |
dc.publisher | Elsevier BV | en_US |
dc.relation.isversionof | https://doi.org/10.1016/j.sigpro.2021.108361 | en_US |
dc.source.title | Signal Processing | en_US |
dc.subject | Robust networks | en_US |
dc.subject | Extreme learning machine | en_US |
dc.subject | Sparse recovery | en_US |
dc.subject | Regularization | en_US |
dc.subject | Hard thresholding | en_US |
dc.title | Subset based error recovery | en_US |
dc.type | Article | en_US |