Trainable fractional Fourier transform

Date

2024

Abstract

Recently, the fractional Fourier transform (FrFT) has been integrated into distinct deep neural network (DNN) models such as transformers, sequence models, and convolutional neural networks (CNNs). However, in previous works, the fractional order a is merely considered a hyperparameter and selected heuristically or tuned manually, which hinders the applicability of the FrFT in deep neural networks. We extend the scope of the FrFT and introduce it as a trainable layer in neural network architectures, where a is learned during training along with the network weights. We mathematically show that a can be updated in any neural network architecture through backpropagation during the training phase. We also conduct extensive experiments on benchmark datasets encompassing image classification and time series prediction tasks. Our results show that trainable FrFT layers alleviate the need to search for a suitable a and improve performance over time-domain and Fourier-domain approaches. Our source code is publicly available for reproducibility.
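The abstract's central idea, learning the fractional order a by backpropagation alongside the network weights, can be illustrated with a short sketch. This is a hypothetical illustration, not the paper's implementation: the class name TrainableFrFT is invented, and the discrete FrFT is realized here as a fractional matrix power of the unitary DFT via a fixed eigendecomposition, which is only one of several possible discretizations. What matters for the abstract's claim is that a enters the transform through a differentiable complex exponential, so a standard optimizer can update it.

```python
import math

import numpy as np
import torch


class TrainableFrFT(torch.nn.Module):
    """Sketch of a discrete FrFT layer with a trainable fractional order `a`.

    Hypothetical illustration (not the paper's implementation): the
    fractional power of the unitary DFT matrix,
        F^a = V diag(exp(-i*pi*k*a/2)) V^{-1},
    is formed from a fixed eigendecomposition, so gradients with respect
    to `a` flow through the complex exponential.
    """

    def __init__(self, n: int, a_init: float = 0.5):
        super().__init__()
        # The fractional order is an ordinary trainable parameter.
        self.a = torch.nn.Parameter(torch.tensor(a_init))
        # Unitary DFT matrix and its fixed (non-trainable) eigendecomposition.
        F = np.fft.fft(np.eye(n), norm="ortho")
        w, V = np.linalg.eig(F)
        # DFT eigenvalues are exp(-i*pi*k/2) for integer "charges" k in
        # {0,1,2,3}; round to recover k so the fractional power uses a
        # consistent branch.
        k = np.round(np.angle(w) / (-math.pi / 2)) % 4
        V_t = torch.tensor(V, dtype=torch.cdouble)
        self.register_buffer("V", V_t)
        self.register_buffer("Vinv", torch.linalg.inv(V_t))
        self.register_buffer("k", torch.tensor(k, dtype=torch.double))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Eigenvalues raised to the (trainable) fractional power a.
        lam = torch.exp(-1j * (math.pi / 2) * self.a * self.k)
        Fa = self.V @ torch.diag(lam) @ self.Vinv  # F^a
        # Apply F^a to each row of x (output is complex-valued).
        return x.to(torch.cdouble) @ Fa.transpose(0, 1)
```

With this construction, a = 0 reduces to the identity, a = 1 to the ordinary (orthonormal) DFT, and composing two layers with orders a and b yields order a + b; a loss backpropagated through the layer produces a gradient for a, so no manual search over the order is needed.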

Source Title

IEEE Signal Processing Letters

Publisher

IEEE

Language

English