Trainable fractional Fourier transform

buir.contributor.authorKoç, Emirhan
buir.contributor.authorAlikaşifoğlu, Tuna
buir.contributor.authorAras, Arda Can
buir.contributor.authorKoç, Aykut
buir.contributor.orcidKoç, Emirhan|0000-0002-7275-1570
buir.contributor.orcidAlikaşifoğlu, Tuna|0000-0001-8030-8088
buir.contributor.orcidAras, Arda Can|0009-0000-0378-1779
buir.contributor.orcidKoç, Aykut|0000-0002-6348-2663
dc.citation.epage755
dc.citation.spage751
dc.citation.volumeNumber31
dc.contributor.authorKoç, Emirhan
dc.contributor.authorAlikaşifoğlu, Tuna
dc.contributor.authorAras, Arda Can
dc.contributor.authorKoç, Aykut
dc.date.accessioned2025-02-20T12:54:49Z
dc.date.available2025-02-20T12:54:49Z
dc.date.issued2024
dc.departmentDepartment of Electrical and Electronics Engineering
dc.departmentNational Magnetic Resonance Research Center (UMRAM)
dc.description.abstractRecently, the fractional Fourier transform (FrFT) has been integrated into distinct deep neural network (DNN) models such as transformers, sequence models, and convolutional neural networks (CNNs). In previous works, however, the fractional order $\boldsymbol{a}$ is treated merely as a hyperparameter and selected heuristically or tuned manually, which hinders the applicability of the FrFT in deep neural networks. We extend the scope of the FrFT by introducing it as a trainable layer in neural network architectures, where $\boldsymbol{a}$ is learned during training along with the network weights. We mathematically show that $\boldsymbol{a}$ can be updated in any neural network architecture through backpropagation. We also conduct extensive experiments on benchmark datasets encompassing image classification and time series prediction tasks. Our results show that trainable FrFT layers alleviate the need to search for a suitable $\boldsymbol{a}$ and improve performance over time-domain and Fourier-domain approaches. We publicly share our source code for reproducibility.
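The abstract's key technical point, that the order $\boldsymbol{a}$ can be updated through backpropagation like any other weight, can be illustrated with a short sketch. The following PyTorch layer is a hypothetical minimal construction, not the authors' released code: the class name TrainableFrFT, the eigendecomposition route, and all parameter names are assumptions. It takes the a-th power of the unitary DFT matrix via a fixed eigendecomposition (principal branch of the logarithm), which is one valid fractional power but not necessarily the Hermite-Gaussian discretization the letter builds on.

import torch
import torch.nn as nn

class TrainableFrFT(nn.Module):
    """Hypothetical minimal FrFT layer with a trainable fractional order.

    Realizes F^a through a fixed eigendecomposition of the unitary DFT
    matrix, so the order enters only through a differentiable exponential
    and is updated by ordinary backpropagation, as the letter claims.
    """

    def __init__(self, n: int, a_init: float = 0.5):
        super().__init__()
        # The fractional order is an ordinary parameter, learned jointly
        # with the network weights instead of being hand-tuned.
        self.a = nn.Parameter(torch.tensor(a_init))
        # Unitary DFT matrix F; F^1 is the ordinary Fourier transform.
        f = torch.fft.fft(torch.eye(n, dtype=torch.complex128), dim=0) / n ** 0.5
        lam, v = torch.linalg.eig(f)                     # F = V diag(lam) V^-1
        self.register_buffer("log_lam", torch.log(lam))  # principal branch
        self.register_buffer("v", v)
        self.register_buffer("v_inv", torch.linalg.inv(v))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # F^a = V diag(exp(a * log lam)) V^-1, differentiable in self.a.
        frft = self.v @ torch.diag(torch.exp(self.a * self.log_lam)) @ self.v_inv
        return x.to(frft.dtype) @ frft.T  # transform along the last axis

# The order receives a gradient like any other weight:
layer = TrainableFrFT(n=64)
loss = layer(torch.randn(8, 64)).abs().pow(2).mean()
loss.backward()
print(layer.a.grad)  # nonzero scalar gradient for the fractional order

At a = 0 this layer reduces to the identity and at a = 1 to the DFT; intermediate, learned values interpolate between the time and Fourier domains, which matches the interpolation behavior the abstract describes.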
dc.identifier.doi10.1109/LSP.2024.3372779
dc.identifier.eissn1558-2361
dc.identifier.issn1070-9908
dc.identifier.urihttps://hdl.handle.net/11693/116509
dc.language.isoEnglish
dc.publisherIEEE
dc.relation.isversionofhttps://dx.doi.org/10.1109/LSP.2024.3372779
dc.rightsCC BY-NC-ND (Attribution-NonCommercial-NoDerivs 4.0 International)
dc.rights.urihttps://creativecommons.org/licenses/by-nc-nd/4.0/deed.en
dc.source.titleIEEE Signal Processing Letters
dc.subjectMachine learning
dc.subjectNeural networks
dc.subjectFT
dc.subjectFractional FT
dc.subjectDeep learning
dc.titleTrainable fractional Fourier transform
dc.typeArticle

Files

Original bundle
Name: Trainable_Fractional_Fourier_Transform.pdf
Size: 805.43 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed to upon submission