Browsing by Subject "Covariance"
Now showing 1 - 3 of 3
Item Open Access: Covariance function of a bivariate distribution function estimator for left truncated and right censored data (Institute of Statistical Science, Academia Sinica, 1998). Gijbels, I.; Gürler, Ü.

In left truncation and right censoring models one observes i.i.d. samples from the triplet (T, Z, δ) only if T ≤ Z, where Z = min(Y, C) and δ is one if Z = Y and zero otherwise. Here, Y is the variable of interest, T is the truncating variable and C is the censoring variable. Recently, Gürler and Gijbels (1996) proposed a nonparametric estimator for the bivariate distribution function when one of the components is subject to left truncation and right censoring. An asymptotic representation of this estimator as a mean of i.i.d. random variables with a negligible remainder term has been developed. This result establishes convergence to a two-time-parameter Gaussian process. The covariance structure of the limiting process is quite complicated, however, and is derived in this paper. We also consider the special case of censoring only, in which the general expression for the variance function reduces to a simpler formula.

Item Open Access: Efficient online learning with improved LSTM neural networks (Elsevier, 2020-04-14). Mirza, Ali H.; Kerpiçci, Mine; Kozat, Süleyman S.

We introduce efficient online learning algorithms based on Long Short-Term Memory (LSTM) networks that employ covariance information. In particular, we introduce the covariance of the present and one-time-step-past input vectors into the gating structure of the LSTM networks. Additionally, we include the covariance of the output vector, and we learn the corresponding weight matrices, whose updates we also provide, to improve the learning performance of the LSTM networks. We reduce the number of system parameters through weight matrix factorization, converting each LSTM weight matrix into two smaller matrices in order to achieve high learning performance with low computational complexity. Moreover, we apply the introduced approach to the Gated Recurrent Unit (GRU) architecture. In our experiments, we illustrate significant performance improvements achieved by our methods on real-life datasets with respect to the vanilla LSTM and vanilla GRU networks.
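For the first abstract above, a minimal LaTeX sketch of the model and the i.i.d. representation it refers to; the influence-function name ξ, the remainder R_n, and the covariance symbol Γ are illustrative notation, not necessarily the paper's.

```latex
% LTRC model: (T, Z, \delta) is observed only on \{T \le Z\},
% with Z = \min(Y, C) and \delta = \mathbf{1}\{Z = Y\}.
% The i.i.d. representation behind the Gaussian limit
% (the names \xi and R_n are assumed, not the paper's):
\[
  \hat{F}_n(y,t) - F(y,t)
    = \frac{1}{n}\sum_{i=1}^{n} \xi(T_i, Z_i, \delta_i;\, y, t)
      + R_n(y,t),
\]
% with R_n uniformly negligible, so that \sqrt{n}\,(\hat{F}_n - F)
% converges weakly to a two-time-parameter Gaussian process whose
% covariance (the object derived in the paper) takes the form
\[
  \Gamma\bigl((y_1,t_1),(y_2,t_2)\bigr)
    = \operatorname{E}\bigl[\xi(T,Z,\delta;\,y_1,t_1)\,
                            \xi(T,Z,\delta;\,y_2,t_2)\bigr].
\]
```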
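For the second abstract, a minimal NumPy sketch of one way covariance information could enter the LSTM gates: the outer product of the present and one-step-past input vectors is flattened and fed through extra gate weights. The class name, the shapes, and the exact placement of the covariance term are assumptions for illustration, not the paper's definitive equations.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CovGateLSTMCell:
    """Sketch: an LSTM cell whose gates also see the covariance
    (outer product) of the present and one-step-past inputs."""

    def __init__(self, n_in, n_hid, rng=np.random.default_rng(0)):
        def mat(r, c):
            return rng.standard_normal((r, c)) * 0.1
        # Standard LSTM weights for gates i, f, o and candidate g.
        self.Wx = {k: mat(n_hid, n_in) for k in "ifog"}
        self.Wh = {k: mat(n_hid, n_hid) for k in "ifog"}
        self.b = {k: np.zeros(n_hid) for k in "ifog"}
        # Extra weights reading the flattened input covariance
        # (an assumed placement of the covariance term).
        self.Wc = {k: mat(n_hid, n_in * n_in) for k in "ifog"}

    def step(self, x_t, x_prev, h_prev, c_prev):
        # Covariance of present and one-step-past inputs as an
        # outer product, flattened to a vector.
        cov = np.outer(x_t, x_prev).ravel()
        def pre(k):
            return (self.Wx[k] @ x_t + self.Wh[k] @ h_prev
                    + self.Wc[k] @ cov + self.b[k])
        i, f, o = (sigmoid(pre(k)) for k in "ifo")
        g = np.tanh(pre("g"))
        c = f * c_prev + i * g      # cell state update
        h = o * np.tanh(c)          # hidden state
        return h, c
```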
Item Open Access: Low complexity efficient online learning algorithms using LSTM networks (2018-12). Mirza, Ali Hassan

In this thesis, we implement efficient online learning algorithms using Long Short-Term Memory (LSTM) networks with low time and computational complexity. In Chapter 2, we investigate efficient covariance-information-based online learning using LSTM networks, known as Co-LSTM networks. We incorporate the covariance information into the LSTM gating structure and propose various efficient models. We reduce the computational complexity by applying the Weight Matrix Factorization (WMF) trick and derive additive gradient-based updates. In Chapter 3, we give a practical application of Co-LSTM networks to network intrusion detection. In Chapter 4, we propose a boosted binary version of Tree-LSTM networks, which we call BBT-LSTM networks. We introduce a depth and windowing factor into the N-ary Tree-LSTM networks, where each LSTM node is split in two and the whole tree architecture grows in a balanced manner. To reduce the computational complexity of the BBT-LSTM networks, we apply the WMF trick, replace the regular multiplication operator with an energy-efficient operator, and introduce a slicing operation on the BBT-LSTM weight matrices. In Chapter 5, we propose another low complexity LSTM network based on a minimal number of hops over the input data sequence, and we study two methods for selecting an appropriate hopping distance. Through an extensive set of experiments on real-life datasets, we demonstrate significant performance gains for the proposed algorithms at the end of each chapter.
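A small sketch of the weight matrix factorization (WMF) idea that both the paper and the thesis lean on: replacing an m x n weight matrix with two smaller factors so the parameter count drops from m*n to r*(m+n) at rank r. The truncated-SVD initialization and the function name are assumptions for illustration; the sources only state that each weight matrix is converted into two smaller matrices.

```python
import numpy as np

def factorize_weight(W, rank):
    """Truncated-SVD sketch of WMF: W is approximated by A @ B,
    with A of shape (m, rank) and B of shape (rank, n)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # fold singular values into A
    B = Vt[:rank, :]
    return A, B

# Example: a 256 x 512 gate matrix factorized at rank 32.
W = np.random.default_rng(1).standard_normal((256, 512))
A, B = factorize_weight(W, rank=32)
print(W.size, A.size + B.size)   # 131072 vs 24576 parameters
```

The trade-off is the usual low-rank one: a small rank cuts both storage and the per-step multiplication cost, at the price of restricting the gate transformations to rank-r maps.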