Efficient online learning algorithms based on LSTM neural networks
Author
Ergen, T.
Kozat, S. S.
Date
2018
Source Title
IEEE Transactions on Neural Networks and Learning Systems
Print ISSN
2162-237X
Publisher
Institute of Electrical and Electronics Engineers
Volume
29
Issue
8
Pages
3772 - 3783
Language
English
Type
Article
Abstract
We investigate online nonlinear regression and introduce novel regression structures based on long short-term memory (LSTM) networks. For the introduced structures, we also provide highly efficient and effective online training methods: we put the underlying architecture in a state-space form and introduce particle filtering (PF)-based updates, along with stochastic gradient descent (SGD) and extended Kalman filter (EKF)-based updates. Our PF-based training method guarantees convergence to the optimal parameter estimate in the mean-square-error sense, provided that we have a sufficient number of particles and certain technical conditions are satisfied. More importantly, by controlling the number of particles, we achieve this performance with a computational complexity on the order of first-order gradient-based methods. Since our approach is generic, we also introduce a gated recurrent unit (GRU)-based approach by directly replacing the LSTM architecture with the GRU architecture, and we demonstrate the superiority of our LSTM-based approach in the sequential prediction task on different real-life data sets. In addition, the experimental results illustrate significant performance improvements achieved by the introduced algorithms with respect to conventional methods on several benchmark real-life data sets.
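To make the PF-based parameter-update idea in the abstract concrete, the following is a minimal sketch of particle-filter-style online estimation of a regressor's parameters treated as the hidden state of a state-space model. It uses a simple tanh regressor as a stand-in for the paper's LSTM state-space formulation, and all names, noise levels, and the artificial-dynamics jitter are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): estimate the parameters w of a
# nonlinear regressor y_t = tanh(w . x_t) + noise, online, by maintaining
# a population of parameter "particles" and reweighting them by likelihood.
dim = 3
n_particles = 200
true_w = np.array([0.5, -1.0, 0.8])   # ground truth, used only to simulate data
obs_var = 0.05                        # assumed observation-noise variance

particles = rng.normal(0.0, 1.0, size=(n_particles, dim))  # parameter particles
weights = np.full(n_particles, 1.0 / n_particles)

for t in range(300):
    x = rng.normal(size=dim)
    y = np.tanh(true_w @ x) + rng.normal(0.0, np.sqrt(obs_var))

    # Artificial dynamics: jitter the particles so the (static) parameters
    # remain explorable by the filter.
    particles += rng.normal(0.0, 0.02, size=particles.shape)

    # Reweight each particle by the Gaussian likelihood of the observation,
    # subtracting the max log-likelihood for numerical stability.
    preds = np.tanh(particles @ x)
    loglik = -0.5 * (y - preds) ** 2 / obs_var
    weights *= np.exp(loglik - loglik.max())
    weights /= weights.sum()

    # Resample when the effective sample size drops below half the particles.
    ess = 1.0 / np.sum(weights ** 2)
    if ess < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
        weights[:] = 1.0 / n_particles

# Posterior-mean parameter estimate after the online pass.
w_hat = weights @ particles
print(np.round(w_hat, 2))
```

The per-step cost is linear in the number of particles, which mirrors the abstract's point that the particle count is the knob trading estimation quality against complexity comparable to first-order gradient methods.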
Keywords
Gated recurrent unit (GRU)
Kalman filtering
Long short-term memory (LSTM)
Online learning
Particle filtering (PF)
Regression
Stochastic gradient descent (SGD)