Browsing by Subject "Long short term memory"
Now showing 1 - 3 of 3
Item (Open Access): Enhancing human operator performance with long short-term memory networks in adaptively controlled systems (Institute of Electrical and Electronics Engineers, 2023-11-20)
Uzun, M. Yusuf; İnanç, Emirhan; Yıldız, Yıldıray

The focus of this letter is developing a Long Short-Term Memory (LSTM) network-based control framework that works in collaboration with the human operator to enhance the overall closed-loop system performance in adaptively controlled systems. The domain of investigation is chosen to be flight control, although the proposed approach can be generalized to other domains, such as automotive control. In accordance with this choice, an adaptive human pilot model is used as the mathematical representation of the pilot during the technical development of the method. An LSTM network is designed such that it predicts and compensates for the inadequacies of the human operator's decisions while they fly an aircraft that has an adaptive inner-loop controller. The simulation results demonstrate that the tracking performance is improved and the pilot workload is reduced.

Item (Open Access): A novel anomaly detection approach based on neural networks (Institute of Electrical and Electronics Engineers, 2018)
Ergen, Tolga; Kerpiççi, Mine

In this paper, we introduce a Long Short-Term Memory (LSTM) network-based anomaly detection algorithm that works in an unsupervised framework. We first introduce an LSTM-based structure that maps variable-length data sequences to fixed-length sequences. Then, we propose a scoring function for anomaly detection based on the One-Class Support Vector Machine (OC-SVM) algorithm. For training, we propose a gradient-based algorithm to find the optimal parameters for both the LSTM architecture and the OC-SVM formulation. Since we modify the original OC-SVM formulation, we also provide convergence results of the modified formulation to the original one.
Thus, the proposed algorithm is able to process data with variable-length sequences. The algorithm also provides high performance on time-series data. In our experiments, we demonstrate significant performance improvements with respect to conventional methods.

Item (Open Access): Text categorization using syllables and recurrent neural networks (2017-07)
Yar, Ersin

We investigate multi-class categorization of short texts. To this end, in the third chapter, we introduce highly efficient dimensionality-reduction techniques suitable for online processing of high-dimensional feature vectors generated from freely worded text. Although text processing and classification are highly important due to many applications such as emotion recognition and advertisement selection, online classification and regression algorithms over text are limited by the need for high-dimensional vectors to represent natural text inputs. We overcome these limitations by showing that randomized projections and piecewise-linear models can be efficiently leveraged to significantly reduce the computational cost of feature-vector extraction from tweets. We demonstrate our results on tweets collected from a real-life case study in which the tweets are freely worded and unstructured. We implement several well-known machine learning algorithms as well as novel regression methods and demonstrate that we can significantly reduce the computational complexity with insignificant change in classification and regression performance.

Furthermore, in the fourth chapter, we introduce a simple and novel technique for short-text classification based on LSTM neural networks. Our algorithm obtains two distributed representations of a short text to be used in the classification task. We derive one representation by processing the vector embeddings corresponding to the words consecutively in the LSTM structure and taking the average of the outputs produced at each time step of the network.
We obtain the other representation by averaging the distributed representations of the words in the short text. For classification, a weighted combination of the two representations is calculated. Moreover, for the first time in the literature, we propose using syllables to better exploit the sequential nature of the data. We derive distributed representations of the syllables and feed them into an LSTM network to obtain the distributed representation of the short text. A softmax layer then computes the categorical distribution. Classification performance is evaluated in terms of the AUC measure. Experiments show that utilizing the two distributed representations improves classification performance by 2%. Furthermore, we demonstrate that using distributed representations of syllables in short-text categorization also provides performance improvements.
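The two-representation idea in the last abstract can be illustrated concretely. The sketch below is a minimal, hypothetical NumPy rendering (not the thesis code): one representation averages the hidden outputs of a single LSTM pass over the token embeddings, the other averages the embeddings themselves, and a weighted combination of the two is pushed through a softmax. All dimensions, parameter names, and the mixing weight `alpha` are illustrative assumptions; trained parameters are replaced by random ones.

```python
import numpy as np

# Illustrative dimensions; the thesis does not specify these values.
EMBED_DIM = 8    # token (word or syllable) embedding size
HIDDEN_DIM = 8   # LSTM hidden size (kept equal to EMBED_DIM so the
                 # two representations can be mixed directly)
NUM_CLASSES = 3

def lstm_step(x, h, c, W, U, b):
    """One time step of a standard LSTM (input, forget, output, cell gates)."""
    z = W @ x + U @ h + b                      # all four gates at once
    i, f, o, g = np.split(z, 4)
    i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))  # sigmoids
    g = np.tanh(g)
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def two_representations(embeddings, W, U, b):
    """Average of per-step LSTM outputs, and average of raw embeddings."""
    h = np.zeros(HIDDEN_DIM)
    c = np.zeros(HIDDEN_DIM)
    outputs = []
    for x in embeddings:                       # process tokens consecutively
        h, c = lstm_step(x, h, c, W, U, b)
        outputs.append(h)
    return np.mean(outputs, axis=0), np.mean(embeddings, axis=0)

def classify(embeddings, W, U, b, Wc, alpha=0.5):
    """Weighted combination of both representations, then softmax."""
    r_lstm, r_avg = two_representations(embeddings, W, U, b)
    r = alpha * r_lstm + (1.0 - alpha) * r_avg
    logits = Wc @ r
    p = np.exp(logits - logits.max())          # numerically stable softmax
    return p / p.sum()

# Random parameters stand in for trained ones in this sketch.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4 * HIDDEN_DIM, EMBED_DIM))
U = 0.1 * rng.standard_normal((4 * HIDDEN_DIM, HIDDEN_DIM))
b = np.zeros(4 * HIDDEN_DIM)
Wc = 0.1 * rng.standard_normal((NUM_CLASSES, HIDDEN_DIM))

tokens = rng.standard_normal((5, EMBED_DIM))   # a 5-token "short text"
probs = classify(tokens, W, U, b, Wc)          # categorical distribution
```

The same pipeline applies unchanged when `tokens` holds syllable embeddings instead of word embeddings, which is the substitution the thesis proposes.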