Author: Mirza, Ali H.
Date accessioned: 2019-02-21
Date available: 2019-02-21
Date issued: 2018
ISBN: 9781538615010
URI: http://hdl.handle.net/11693/50223
Date of Conference: 2-5 May 2018

Abstract: In this paper, we derive the online additive updates of the gated recurrent unit (GRU) network using the fast Fourier transform-inverse fast Fourier transform (FFT-IFFT) operator. In the gating process of the GRU network, we work in the frequency domain and execute all the linear operations there. For the non-linear functions in the gating process, we first shift back to the time domain and then apply the non-linear GRU gating functions. Furthermore, in order to reduce the computational complexity and speed up the training process, we apply weight matrix factorization (WMF) to the FFT-IFFT variant of the GRU network. We then compute the online additive updates of the FFT-WMF based GRU networks using the stochastic gradient descent (SGD) algorithm. We also use long short-term memory (LSTM) networks in place of the GRU networks. Through an extensive set of experiments, we illustrate that our proposed algorithm achieves a significant increase in performance together with a decrease in computational complexity.

Language: English
Keywords: FFT; GRU; Online learning; Online updates; SGD
Title: Online additive updates with FFT-IFFT operator on the GRU neural networks
Type: Conference Paper
DOI: 10.1109/SIU.2018.8404456
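
The frequency-domain gating step described in the abstract can be made concrete with a short sketch. The paper does not publish its implementation, so the NumPy code below is a minimal illustration under one common assumption: each gate weight matrix is parameterized as a circulant matrix, so that every matrix-vector product reduces to an FFT, a pointwise product, and an IFFT, while the sigmoid and tanh gating non-linearities are applied after shifting back to the time domain. The class name FFTGRUCell, the circulant parameterization, and the helper lowrank_matvec are hypothetical, not the authors' code.

    import numpy as np

    def circulant_matvec(w, x):
        # circ(w) @ x computed in the frequency domain:
        # IFFT(FFT(w) * FFT(x)) is the circular convolution of w and x,
        # i.e. the product of the circulant matrix with first column w and x.
        return np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)))

    def lowrank_matvec(U, V, x):
        # Weight matrix factorization (WMF) variant: W is approximated as
        # U @ V with inner rank k << n, so W @ x costs O(n k) instead of O(n^2).
        return U @ (V @ x)

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    class FFTGRUCell:
        # Hypothetical GRU cell: all linear gate operations run in the
        # frequency domain (assumed circulant weights); the non-linear
        # gating functions are applied in the time domain.
        def __init__(self, n, seed=None):
            rng = np.random.default_rng(seed)
            # One defining vector per circulant weight matrix: input (w*)
            # and recurrent (u*) weights of the update gate z, the reset
            # gate r, and the candidate state.
            self.w = {g: rng.normal(0.0, 1.0 / np.sqrt(n), n)
                      for g in ("wz", "uz", "wr", "ur", "wh", "uh")}

        def step(self, x, h):
            mv, w = circulant_matvec, self.w
            z = sigmoid(mv(w["wz"], x) + mv(w["uz"], h))            # update gate
            r = sigmoid(mv(w["wr"], x) + mv(w["ur"], h))            # reset gate
            h_tilde = np.tanh(mv(w["wh"], x) + mv(w["uh"], r * h))  # candidate
            return (1.0 - z) * h + z * h_tilde

Under this assumption, a circulant gate matrix cuts the per-gate cost from O(n^2) for a dense matrix to O(n log n), while the WMF factorization quoted in the abstract would instead cost O(n k). In an online additive SGD scheme, each defining vector would then be updated after every sample by subtracting the learning rate times the gradient of the instantaneous loss.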