Trainable fractional Fourier transform
buir.contributor.author | Koç, Emirhan | |
buir.contributor.author | Alikaşifoğlu, Tuna | |
buir.contributor.author | Aras, Arda Can | |
buir.contributor.author | Koç, Aykut | |
buir.contributor.orcid | Koç, Emirhan|0000-0002-7275-1570 | |
buir.contributor.orcid | Alikaşifoğlu, Tuna|0000-0001-8030-8088 | |
buir.contributor.orcid | Aras, Arda Can|0009-0000-0378-1779 | |
buir.contributor.orcid | Koç, Aykut|0000-0002-6348-2663 | |
dc.citation.epage | 755 | |
dc.citation.spage | 751 | |
dc.citation.volumeNumber | 31 | |
dc.contributor.author | Koç, Emirhan | |
dc.contributor.author | Alikaşifoğlu, Tuna | |
dc.contributor.author | Aras, Arda Can | |
dc.contributor.author | Koç, Aykut | |
dc.date.accessioned | 2025-02-20T12:54:49Z | |
dc.date.available | 2025-02-20T12:54:49Z | |
dc.date.issued | 2024 | |
dc.department | Department of Electrical and Electronics Engineering | |
dc.department | National Magnetic Resonance Research Center (UMRAM) | |
dc.description.abstract | Recently, the fractional Fourier transform (FrFT) has been integrated into distinct deep neural network (DNN) models such as transformers, sequence models, and convolutional neural networks (CNNs). However, in previous works, the fraction order $\boldsymbol{a}$ is merely treated as a hyperparameter and is selected heuristically or tuned manually to find suitable values, which hinders the applicability of the FrFT in deep neural networks. We extend the scope of the FrFT and introduce it as a trainable layer in neural network architectures, where $\boldsymbol{a}$ is learned during training along with the network weights. We mathematically show that $\boldsymbol{a}$ can be updated in any neural network architecture through backpropagation during the training phase. We also conduct extensive experiments on benchmark datasets encompassing image classification and time series prediction tasks. Our results show that trainable FrFT layers alleviate the need to search for a suitable $\boldsymbol{a}$ and improve performance over time-domain and Fourier-domain approaches. We publicly share our source code for reproducibility. | |
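The abstract's core claim, that the fraction order $\boldsymbol{a}$ can be learned through backpropagation like any other network weight, is illustrated in the minimal PyTorch sketch below. This is not the authors' released code: the class name `TrainableFrFT` is hypothetical, and the transform is realized as a simple eigendecomposition-based fractional power of the unitary DFT matrix rather than the canonical Hermite-Gaussian eigenvector construction from the FrFT literature. The point is only that the eigendecomposition is fixed while the diagonal exponent depends smoothly on $\boldsymbol{a}$, so gradients reach it naturally.

```python
# A minimal sketch (NOT the authors' released code) of a trainable FrFT layer.
# The fraction order `a` is an nn.Parameter updated by backpropagation; the
# transform is F^a = V diag(exp(a * log(lambda))) V^{-1}, a fractional power
# of the unitary DFT matrix.  This simple eigendecomposition is not the
# canonical Hermite-Gaussian FrFT construction; it only illustrates how `a`
# can be learned jointly with the other weights.

import torch
import torch.nn as nn


class TrainableFrFT(nn.Module):  # hypothetical name, for illustration
    def __init__(self, n: int, a_init: float = 0.5):
        super().__init__()
        # Unitary n x n DFT matrix.
        k = torch.arange(n, dtype=torch.float64)
        F = torch.exp(-2j * torch.pi * torch.outer(k, k) / n) / n ** 0.5
        # Eigendecomposition F = V diag(lam) V^{-1}, precomputed once; only
        # the diagonal exponent in forward() depends on the trainable order.
        lam, V = torch.linalg.eig(F)
        self.register_buffer("log_lam", torch.log(lam))  # ~ i * angle(lam)
        self.register_buffer("V", V)
        self.register_buffer("V_inv", torch.linalg.inv(V))
        # Fraction order as a trainable scalar, learned with the weights.
        self.a = nn.Parameter(torch.tensor(a_init, dtype=torch.float64))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # lam^a = exp(a * log(lam)) is smooth in `a`, so gradients w.r.t.
        # the fraction order flow through the complex exponential.
        Fa = self.V @ torch.diag(torch.exp(self.a * self.log_lam)) @ self.V_inv
        return x.to(Fa.dtype) @ Fa.T


if __name__ == "__main__":
    layer = TrainableFrFT(n=16)
    x = torch.randn(4, 16, dtype=torch.float64)
    y = layer(x)
    # `a` receives a gradient like any other network parameter.
    y.abs().pow(2).sum().backward()
    print(layer.a.grad)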
dc.description.provenance | Submitted by İsmail Akdağ (ismail.akdag@bilkent.edu.tr) on 2025-02-20T12:54:49Z No. of bitstreams: 1 Trainable_Fractional_Fourier_Transform.pdf: 824758 bytes, checksum: ff2d9a96f790f0ec8d60b36077b2563c (MD5) | en |
dc.description.provenance | Made available in DSpace on 2025-02-20T12:54:49Z (GMT). No. of bitstreams: 1 Trainable_Fractional_Fourier_Transform.pdf: 824758 bytes, checksum: ff2d9a96f790f0ec8d60b36077b2563c (MD5) Previous issue date: 2024 | en |
dc.identifier.doi | 10.1109/LSP.2024.3372779 | |
dc.identifier.eissn | 1558-2361 | |
dc.identifier.issn | 1070-9908 | |
dc.identifier.uri | https://hdl.handle.net/11693/116509 | |
dc.language.iso | en | |
dc.publisher | IEEE | |
dc.relation.isversionof | https://dx.doi.org/10.1109/LSP.2024.3372779 | |
dc.rights | CC BY-NC-ND (Attribution-NonCommercial-NoDerivatives 4.0 International) | |
dc.rights.uri | https://creativecommons.org/licenses/by-nc-nd/4.0/deed.en | |
dc.source.title | IEEE Signal Processing Letters | |
dc.subject | Machine learning | |
dc.subject | Neural networks | |
dc.subject | Fourier transform (FT) | |
dc.subject | Fractional Fourier transform | |
dc.subject | Deep learning | |
dc.title | Trainable fractional Fourier transform | |
dc.type | Article |