Online additive updates with FFT-IFFT operator on the GRU neural networks

Date
2018
Source Title
2018 26th Signal Processing and Communications Applications Conference (SIU)
Publisher
IEEE
Language
English
Abstract

In this paper, we derive online additive updates for gated recurrent unit (GRU) networks using the fast Fourier transform-inverse fast Fourier transform (FFT-IFFT) operator. In the gating process of the GRU network, we execute all linear operations in the frequency domain. For the nonlinear functions in the gating process, we first shift back to the time domain and then apply the nonlinear GRU gating functions. Furthermore, to reduce computational complexity and speed up training, we apply weight matrix factorization (WMF) to the FFT-IFFT variant of the GRU network. We then compute the online additive updates of the FFT-WMF based GRU network using the stochastic gradient descent (SGD) algorithm. We also employ long short-term memory (LSTM) networks in place of the GRU networks. Through an extensive set of experiments, we show that the proposed algorithm achieves a significant increase in performance with a decrease in computational complexity.
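The abstract's core idea (linear gating operations in the frequency domain, nonlinearities back in the time domain) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the weight matrices are circulant, so each matrix-vector product becomes an elementwise product of FFTs (circular convolution); the function and parameter names (`fft_gru_step`, `circ_mul`, the `p` dictionary keys) are hypothetical.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def circ_mul(w_hat, v):
    """Circulant 'matrix-vector' product done in the frequency domain:
    IFFT(FFT(w) * FFT(v)). w_hat is the precomputed FFT of the circulant
    generator vector, so only an elementwise product remains per step."""
    return np.fft.ifft(w_hat * np.fft.fft(v)).real

def fft_gru_step(x, h_prev, p):
    """One GRU step: all linear maps run in the frequency domain via
    circ_mul; the sigmoid/tanh gating nonlinearities are applied after
    shifting back to the time domain, as described in the abstract."""
    z = sigmoid(circ_mul(p["Wz"], x) + circ_mul(p["Uz"], h_prev) + p["bz"])       # update gate
    r = sigmoid(circ_mul(p["Wr"], x) + circ_mul(p["Ur"], h_prev) + p["br"])       # reset gate
    h_tilde = np.tanh(circ_mul(p["Wh"], x) + circ_mul(p["Uh"], r * h_prev) + p["bh"])  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde  # convex combination of old and candidate state
```

Under this circulant assumption each gate costs O(n log n) per step instead of O(n^2), which is one plausible reading of the complexity reduction the abstract claims; the paper's WMF step would further factor the weights, which this sketch omits.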
