Browsing by Subject "Gaussian mixture models"
Now showing 1 - 2 of 2
Item Open Access
Maximum likelihood estimation of Gaussian mixture models using stochastic search (Elsevier BV, 2012) Arı, Ç.; Aksoy, S.; Arıkan, Orhan
Gaussian mixture models (GMM), commonly used in pattern recognition and machine learning, provide a flexible probabilistic model for the data. The conventional expectation-maximization (EM) algorithm for the maximum likelihood estimation of the parameters of GMMs is very sensitive to initialization and easily gets trapped in local maxima. Stochastic search algorithms have been popular alternatives for global optimization, but their use for GMM estimation has been limited to constrained models using identity or diagonal covariance matrices. Our major contributions in this paper are twofold. First, we present a novel parametrization for arbitrary covariance matrices that allows independent updating of individual parameters while retaining the validity of the resulting matrices. Second, we propose an effective parameter matching technique to mitigate the issues related to the existence of multiple candidate solutions that are equivalent under permutations of the GMM components. Experiments on synthetic and real data sets show that the proposed framework performs robustly and achieves significantly higher likelihood values than the EM algorithm. © 2012 Elsevier Ltd. All rights reserved.

Item Open Access
Maximum likelihood estimation of robust constrained Gaussian mixture models (2013) Arı, Çağlar
Density estimation using Gaussian mixture models presents a fundamental trade-off between the flexibility of the model and its sensitivity to unwanted/unmodeled data points in the data set. The expectation-maximization (EM) algorithm used to estimate the parameters of Gaussian mixture models is prone to local optima due to the nonconvexity of the problem and improper selection of the parameterization. We propose a novel modeling framework, three different parameterizations, and novel algorithms for the constrained Gaussian mixture density estimation problem based on the expectation-maximization algorithm, convex duality theory, and stochastic search algorithms. We propose a new modeling framework called Constrained Gaussian Mixture Models (CGMM) that incorporates prior information into the density estimation problem in the form of convex constraints on the model parameters. In this context, we consider two different parameterizations, where the first set of parameters is referred to as the information parameters and the second set as the source parameters. To estimate the parameters, we use the EM algorithm, where we solve two optimization problems alternatingly in the E-step and the M-step. We show that the M-step corresponds to a convex optimization problem in the information parameters. We form a dual problem for the M-step and show that the dual problem corresponds to a convex optimization problem in the source parameters. We apply the CGMM framework to two different problems: robust density estimation and compound object detection. In the robust density estimation problem, we incorporate the inlier/outlier information available for a small number of data points as convex constraints on the parameters using the information parameters. In the compound object detection problem, we incorporate the relative size, spectral distribution structure, and relative location relations of primitive objects as convex constraints on the parameters using the source parameters.
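For readers unfamiliar with the E-step/M-step alternation referenced above, the following is a minimal sketch of the standard, unconstrained EM updates for a GMM; it is illustrative background only, not the thesis's CGMM variant, which additionally imposes convex constraints in the M-step. The function name `em_gmm` and all defaults are assumptions made for this sketch.

```python
import numpy as np

def em_gmm(X, K, n_iter=100, seed=0, reg=1e-6):
    """Plain EM for a K-component Gaussian mixture on data X of shape (N, d)."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    resp = rng.dirichlet(np.ones(K), size=N)            # random initial responsibilities, (N, K)
    for _ in range(n_iter):
        # M-step: weights, means, covariances that maximize the expected log-likelihood
        Nk = resp.sum(axis=0) + 1e-12                    # effective counts per component
        w = Nk / N
        mu = (resp.T @ X) / Nk[:, None]
        cov = np.empty((K, d, d))
        for k in range(K):
            Xc = X - mu[k]
            cov[k] = (resp[:, k, None] * Xc).T @ Xc / Nk[k] + reg * np.eye(d)
        # E-step: posterior responsibilities under the current parameters
        log_p = np.empty((N, K))
        for k in range(K):
            diff = X - mu[k]
            L = np.linalg.cholesky(cov[k])
            maha = np.sum(np.linalg.solve(L, diff.T) ** 2, axis=0)
            logdet = 2.0 * np.sum(np.log(np.diag(L)))
            log_p[:, k] = np.log(w[k]) - 0.5 * (maha + logdet + d * np.log(2 * np.pi))
        resp = np.exp(log_p - np.logaddexp.reduce(log_p, axis=1, keepdims=True))
    return w, mu, cov
```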
Even with a proper selection of the parameterization, the density estimation problem for Gaussian mixture models is not jointly convex in the E-step and M-step variables. We propose a third parameterization based on the eigenvalue decomposition of covariance matrices, which is suitable for stochastic search algorithms in general and the particle swarm optimization (PSO) algorithm in particular. We develop a new algorithm in which the global search capability of the PSO algorithm is incorporated into the EM algorithm to perform global parameter estimation. In addition to the mathematical derivations, experimental results on synthetic and real-life data sets verifying the performance of the proposed algorithms are provided.
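As a rough illustration of why an eigenvalue-decomposition parameterization suits stochastic search, the sketch below maps an unconstrained real vector (rotation angles plus log-eigenvalues, e.g., one slice of a PSO particle) to a symmetric positive definite covariance matrix, so each parameter can be perturbed independently while the reconstructed matrix stays valid. The function name and the Givens-rotation construction are assumptions for this sketch; the exact parameterization used in the works above may differ in detail.

```python
import numpy as np

def covariance_from_params(angles, log_eigs):
    """Build a valid covariance matrix Q @ diag(exp(log_eigs)) @ Q.T,
    where Q is a product of d*(d-1)/2 Givens rotations (one angle each)."""
    eigs = np.exp(np.asarray(log_eigs, dtype=float))   # strictly positive eigenvalues
    d = eigs.size
    Q, idx = np.eye(d), 0
    for i in range(d - 1):
        for j in range(i + 1, d):
            c, s = np.cos(angles[idx]), np.sin(angles[idx])
            G = np.eye(d)
            G[i, i] = c; G[j, j] = c
            G[i, j] = -s; G[j, i] = s
            Q = Q @ G                                   # accumulate the rotation
            idx += 1
    return Q @ np.diag(eigs) @ Q.T

# Usage: any real-valued parameter vector yields a symmetric positive definite matrix.
rng = np.random.default_rng(0)
d = 3
angles = rng.uniform(-np.pi, np.pi, size=d * (d - 1) // 2)
log_eigs = rng.normal(size=d)
cov = covariance_from_params(angles, log_eigs)
assert np.allclose(cov, cov.T) and np.all(np.linalg.eigvalsh(cov) > 0)
```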