Browsing by Subject "Long short term memory (LSTM)"
Now showing 1 - 3 of 3
Item Open Access
Deep learning in electronic warfare systems: automatic pulse detection and intra-pulse modulation recognition (2020-12). Akyon, Fatih Cagatay.
Detection and classification of radar systems based on modulation analysis of the pulses they transmit is an important application in electronic warfare systems. Many existing works focus on classifying modulations under the assumption that signal detection has already been performed, without providing any detection method. In this work, we propose two novel deep-learning-based techniques for automatic pulse detection and intra-pulse modulation recognition of radar signals. As the first technique, an LSTM-based multi-task learning model is proposed for end-to-end pulse detection and modulation classification. As the second technique, the reassigned spectrogram of the measured radar signal and the detected outliers of its instantaneous phase, filtered by a special function, are used to train multiple convolutional neural networks. Features automatically extracted from the networks are fused to distinguish frequency- and phase-modulated signals. Another major issue in this area is the training and evaluation of supervised neural-network-based models. To overcome this issue, we have developed an Intentional Modulation on Pulse (IMOP) measurement simulator which can generate over 15 main phase and frequency modulations with realistic pulses and noise. Simulation results show that the proposed FFCNN and MODNET techniques outperform the current state-of-the-art alternatives and are easily scalable across a broad range of modulation types.
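As a rough illustration of the first technique in the abstract above, the sketch below shows what an LSTM-based multi-task model with a per-time-step pulse-detection head and a sequence-level modulation-classification head could look like. The layer sizes, the I/Q input representation, the number of modulation classes, and the unweighted joint loss are illustrative assumptions, not the architecture or training setup reported in the thesis.

```python
# Hedged sketch of an LSTM-based multi-task model for pulse detection and
# intra-pulse modulation classification; all sizes and targets are toy values.
import torch
import torch.nn as nn

class PulseMultiTaskLSTM(nn.Module):
    def __init__(self, in_dim=2, hidden=64, n_mods=15):
        super().__init__()
        # Input: a length-T sequence of 2-dim I/Q samples.
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        # Detection head: per-time-step "pulse present" logit.
        self.detect = nn.Linear(hidden, 1)
        # Classification head: modulation type from the final hidden state.
        self.classify = nn.Linear(hidden, n_mods)

    def forward(self, x):                               # x: (batch, T, 2)
        h_seq, (h_last, _) = self.lstm(x)
        det_logits = self.detect(h_seq).squeeze(-1)     # (batch, T)
        mod_logits = self.classify(h_last[-1])          # (batch, n_mods)
        return det_logits, mod_logits

model = PulseMultiTaskLSTM()
x = torch.randn(4, 256, 2)                              # toy batch of I/Q sequences
det_logits, mod_logits = model(x)

# Joint multi-task objective: per-step detection (BCE) + modulation type (CE).
det_target = torch.randint(0, 2, (4, 256)).float()
mod_target = torch.randint(0, 15, (4,))
loss = nn.functional.binary_cross_entropy_with_logits(det_logits, det_target) \
       + nn.functional.cross_entropy(mod_logits, mod_target)
loss.backward()
```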
Item Open Access
An efficient and effective second-order training algorithm for LSTM-based adaptive learning (IEEE, 2021-04-07). Vural, N. Mert; Ergüt, S.; Kozat, Süleyman S.
We study adaptive (or online) nonlinear regression with Long Short-Term Memory (LSTM) based networks, i.e., LSTM-based adaptive learning. In this context, we introduce an efficient Extended Kalman filter (EKF) based second-order training algorithm. Our algorithm is truly online, i.e., it does not assume any underlying data-generating process or future information, except that the target sequence is bounded. Through an extensive set of experiments, we demonstrate significant performance gains achieved by our algorithm with respect to state-of-the-art methods. In particular, we show that our algorithm consistently provides a 10 to 45% improvement in accuracy compared to the widely used adaptive methods Adam, RMSprop, and DEKF, and comparable performance to EKF with a 10 to 15 times reduction in run-time.
Item Open Access
Efficient online learning algorithms based on LSTM neural networks (Institute of Electrical and Electronics Engineers, 2018). Ergen, Tolga; Kozat, Süleyman Serdar.
We investigate online nonlinear regression and introduce novel regression structures based on long short-term memory (LSTM) networks. For the introduced structures, we also provide highly efficient and effective online training methods. To train these novel LSTM-based structures, we put the underlying architecture in a state-space form and introduce highly efficient and effective particle filtering (PF) based updates. We also provide stochastic gradient descent and extended Kalman filter based updates. Our PF-based training method guarantees convergence to the optimal parameter estimate in the mean-square-error sense, provided that we have a sufficient number of particles and certain technical conditions are satisfied. More importantly, we achieve this performance with a computational complexity on the order of first-order gradient-based methods by controlling the number of particles. Since our approach is generic, we also introduce a gated recurrent unit (GRU) based approach by directly replacing the LSTM architecture with the GRU architecture, and we demonstrate the superiority of our LSTM-based approach in sequential prediction tasks on different real-life data sets. In addition, the experimental results illustrate significant performance improvements achieved by the introduced algorithms with respect to conventional methods over several different benchmark real-life data sets.
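The two preceding abstracts both revolve around treating the network weights as the state of a filtering problem and updating them online as samples arrive. The following is a minimal sketch of that general idea: an extended Kalman filter over the parameters of a small LSTM regressor, with the measurement Jacobian obtained via autograd. The model size, noise covariances, toy data stream, and the truncation of the recurrent state at each step are assumptions made for illustration; this is not a reproduction of the algorithms in either paper.

```python
# Hedged sketch: EKF-style online update of LSTM parameters for scalar
# regression; hyperparameters and data are illustrative only.
import torch
import torch.nn as nn
from torch.nn.utils import vector_to_parameters

torch.manual_seed(0)

class TinyLSTMRegressor(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.cell = nn.LSTMCell(1, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x_t, state):
        h, c = self.cell(x_t, state)
        return self.out(h), (h, c)

model = TinyLSTMRegressor()
params = list(model.parameters())
n = sum(p.numel() for p in params)

# EKF over the parameter vector: covariance P, process noise Q, measurement noise R.
P = torch.eye(n)
Q = 1e-4 * torch.eye(n)
R = torch.tensor(1e-2)

state = (torch.zeros(1, 8), torch.zeros(1, 8))
x_seq = torch.randn(200, 1)                       # toy input stream
y_seq = torch.sin(torch.cumsum(x_seq, dim=0))     # toy bounded target stream

for t in range(200):
    x_t = x_seq[t].view(1, 1)
    y_t = y_seq[t].view(())

    # Forward pass; the recurrent state is detached so the Jacobian is taken
    # only with respect to the current step (a simplifying truncation).
    h, c = state
    y_hat, new_state = model(x_t, (h.detach(), c.detach()))
    y_hat = y_hat.view(())

    # Measurement Jacobian H = d y_hat / d params, via autograd (1 x n).
    grads = torch.autograd.grad(y_hat, params)
    H = torch.cat([g.reshape(-1) for g in grads]).view(1, n)

    # Standard EKF update with a random-walk model on the parameters.
    P = P + Q
    S = H @ P @ H.T + R                           # innovation variance (1 x 1)
    K = (P @ H.T) / S                             # Kalman gain (n x 1)
    with torch.no_grad():
        w = torch.cat([p.reshape(-1) for p in params])
        w = w + K.view(-1) * (y_t - y_hat)
        vector_to_parameters(w, params)
    P = P - K @ H @ P

    state = (new_state[0].detach(), new_state[1].detach())
```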