Browsing by Subject "Adaptive learning"
Now showing 1 - 2 of 2
Item Open Access
An efficient and effective second-order training algorithm for LSTM-based adaptive learning (IEEE, 2021-04-07)
Vural, N. Mert; Ergüt, S.; Kozat, Süleyman S.
We study adaptive (or online) nonlinear regression with Long Short-Term Memory (LSTM) based networks, i.e., LSTM-based adaptive learning. In this context, we introduce an efficient Extended Kalman filter (EKF) based second-order training algorithm. Our algorithm is truly online, i.e., it assumes no underlying data-generating process and no future information, except that the target sequence is bounded. Through an extensive set of experiments, we demonstrate significant performance gains achieved by our algorithm over state-of-the-art methods. In particular, we show that our algorithm consistently provides a 10 to 45% improvement in accuracy compared to the widely used adaptive methods Adam, RMSprop, and DEKF, and performance comparable to the EKF with a 10 to 15 times reduction in run-time.

Item Open Access
Efficient estimation of graph signals with adaptive sampling (IEEE, 2020)
Ahmadi, Mohammad Javad; Arablouei, R.; Abdolee, R.
We propose two new least mean squares (LMS)-based algorithms for adaptive estimation of graph signals that improve the convergence speed of the LMS algorithm while preserving its low computational complexity. The first algorithm, named extended least mean squares (ELMS), extends the LMS algorithm by reusing the signal vectors of previous iterations alongside the signal available at the current iteration. Utilizing the previous signal vectors accelerates the convergence of the ELMS algorithm at the expense of a higher steady-state error compared to the LMS algorithm. To further improve the performance, we propose the fast ELMS (FELMS) algorithm, in which the influence of the signal vectors of previous iterations is controlled by optimizing the gradient of the mean-square deviation (GMSD). The FELMS algorithm converges faster than the ELMS algorithm and has steady-state errors comparable to those of the LMS algorithm. We analyze the mean-square performance of the ELMS and FELMS algorithms theoretically and derive the respective convergence conditions as well as the predicted MSD values. In addition, we present an adaptive sampling strategy in which the sampling probability of each node is changed according to the GMSD of the node. Computer simulations using both synthetic and real data validate the theoretical results and demonstrate the merits of the proposed algorithms.
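The first abstract's central idea, treating the network weights as the state of an Extended Kalman filter so that each sample yields a second-order weight update, can be illustrated on a toy model. The sketch below applies the EKF update to a single tanh neuron rather than an LSTM; the model, noise covariances, and sample count are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(w, x):
    # toy nonlinear regressor standing in for the LSTM (an assumption)
    return np.tanh(w @ x)

def jac(w, x):
    # Jacobian of f with respect to the weights w
    return (1.0 - np.tanh(w @ x) ** 2) * x

d = 3
w = np.zeros(d)          # weight estimate (EKF state)
P = np.eye(d)            # weight-error covariance
Q = 1e-4 * np.eye(d)     # process-noise covariance (illustrative)
R = 1e-2                 # measurement-noise variance (illustrative)

w_true = np.array([0.5, -1.0, 0.8])  # hypothetical ground-truth weights
for t in range(500):
    x = rng.standard_normal(d)
    y = np.tanh(w_true @ x) + 0.01 * rng.standard_normal()
    H = jac(w, x)                    # linearization at current estimate
    S = H @ P @ H + R                # innovation variance (scalar)
    K = P @ H / S                    # Kalman gain
    w = w + K * (y - f(w, x))        # second-order online weight update
    P = P - np.outer(K, H @ P) + Q   # covariance update
```

The update is "truly online" in the abstract's sense: each sample is processed once, with no assumptions about the data-generating process beyond bounded targets.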
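The ELMS idea from the second abstract, correcting the current estimate with the signal vectors of previous iterations as well as the current one, can be sketched as follows. This sketch omits the graph-bandlimited projection and uses an illustrative decaying reuse weighting; the reuse depth, step size, sampling probability, and noise level are all assumptions, not the paper's choices:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)

N, mu, M = 8, 0.5, 3               # nodes, step size, reuse depth (illustrative)
x_true = rng.standard_normal(N)    # static signal to estimate (an assumption)
x_hat = np.zeros(N)
history = deque(maxlen=M)          # (sampling mask, noisy observation) pairs

for t in range(2000):
    mask = (rng.random(N) < 0.6).astype(float)          # random node sampling
    y = mask * (x_true + 0.05 * rng.standard_normal(N))
    history.appendleft((mask, y))
    # ELMS-style update: correct with the current observation and up to
    # M-1 previous ones, older observations weighted less (illustrative)
    for m, (D_m, y_m) in enumerate(history):
        x_hat = x_hat + (mu / (m + 1)) * D_m * (y_m - x_hat)
```

Reusing the stored pairs gives each iteration several correction steps instead of one, which is the mechanism the abstract credits for the faster convergence of ELMS over plain LMS.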