Achieving online regression performance of LSTMs with simple RNNs

buir.contributor.author: Vural, Nuri Mert
buir.contributor.author: İlhan, Fatih
buir.contributor.author: Yılmaz, Selim Fırat
buir.contributor.author: Kozat, Süleyman Serdar
buir.contributor.orcid: Vural, Nuri Mert | 0000-0002-2820-2806
buir.contributor.orcid: İlhan, Fatih | 0000-0002-0173-7544
buir.contributor.orcid: Yılmaz, Selim Fırat | 0000-0002-0486-7731
buir.contributor.orcid: Kozat, Süleyman Serdar | 0000-0002-6488-3848
dc.citation.epage: 12
dc.citation.spage: 1
dc.contributor.author: Vural, Nuri Mert
dc.contributor.author: İlhan, Fatih
dc.contributor.author: Yılmaz, Selim Fırat
dc.contributor.author: Ergüt, S.
dc.contributor.author: Kozat, Süleyman Serdar
dc.date.accessioned: 2022-03-04T08:30:34Z
dc.date.available: 2022-03-04T08:30:34Z
dc.date.issued: 2021-06-17
dc.department: Department of Electrical and Electronics Engineering
dc.description.abstract: Recurrent neural networks (RNNs) are widely used for online regression due to their ability to generalize nonlinear temporal dependencies. As an RNN model, long short-term memory networks (LSTMs) are commonly preferred in practice, as these networks are capable of learning long-term dependencies while avoiding the vanishing gradient problem. However, due to their large number of parameters, training LSTMs requires considerably longer time compared to simple RNNs (SRNNs). In this article, we achieve the online regression performance of LSTMs with SRNNs efficiently. To this end, we introduce a first-order training algorithm with a linear time complexity in the number of parameters. We show that when SRNNs are trained with our algorithm, they provide very similar regression performance to LSTMs in two to three times shorter training time. We support our experimental results with a strong theoretical analysis by deriving regret bounds on the convergence rate of our algorithm. Through an extensive set of experiments, we verify our theoretical work and demonstrate significant performance improvements of our algorithm with respect to LSTMs and other state-of-the-art learning models.
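The abstract describes training SRNNs for online regression with a first-order (gradient-based) algorithm whose per-step cost is linear in the number of parameters. As an illustration of that setting only, not of the paper's actual algorithm or its regret-guaranteed updates, the minimal sketch below runs plain online gradient descent on an SRNN regressor; all dimensions, the learning rate, and the toy data stream are illustrative assumptions.

```python
# Minimal sketch: online regression with a simple RNN (SRNN) trained by
# online gradient descent (OGD). NOT the paper's algorithm; it only
# illustrates the online setting the abstract describes. All names and
# constants (n_x, n_h, eta, the synthetic stream) are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h = 4, 16                      # input and hidden dimensions
W = rng.normal(0, 0.1, (n_h, n_h))    # recurrent weights
U = rng.normal(0, 0.1, (n_h, n_x))    # input weights
w = rng.normal(0, 0.1, n_h)           # linear readout
h = np.zeros(n_h)                     # hidden state
eta = 0.01                            # OGD learning rate

def step(h, x):
    """One SRNN update: h_t = tanh(W h_{t-1} + U x_t)."""
    return np.tanh(W @ h + U @ x)

for t in range(1000):                  # synthetic stream, for illustration
    x = rng.normal(size=n_x)           # observe input x_t
    y = np.sin(x.sum())                # observe target y_t (toy ground truth)

    h_prev = h
    h = step(h_prev, x)                # forward pass
    y_hat = w @ h                      # prediction
    err = y_hat - y                    # squared-loss residual

    # Gradients with the recurrence truncated at one step (h_prev treated
    # as a constant), keeping each update linear in the parameter count.
    dh = err * w * (1.0 - h**2)        # backprop through tanh
    W -= eta * np.outer(dh, h_prev)
    U -= eta * np.outer(dh, x)
    w -= eta * err * h
```

Truncating the recurrence at one step keeps each update O(number of parameters), mirroring the linear per-step complexity claimed in the abstract; the paper's algorithm additionally comes with regret bounds that plain truncated OGD does not.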
dc.identifier.doi: 10.1109/TNNLS.2021.3086029
dc.identifier.eissn: 2162-2388
dc.identifier.issn: 2162-237X
dc.identifier.uri: http://hdl.handle.net/11693/77682
dc.language.iso: English
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.isversionof: https://doi.org/10.1109/TNNLS.2021.3086029
dc.source.title: IEEE Transactions on Neural Networks and Learning Systems
dc.subject: Neural network training
dc.subject: Online gradient descent
dc.subject: Online learning
dc.subject: Recurrent neural networks (RNNs)
dc.subject: Regression
dc.title: Achieving online regression performance of LSTMs with simple RNNs
dc.type: Article

Files

Original bundle
Name: Achieving_online_regression_performance_of_LSTMs_with_simple_RNNs.pdf
Size: 1.69 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.69 KB
Description: Item-specific license agreed to upon submission