Global Convergence of the EM Algorithm for Unconstrained Latent Variable Models with Categorical Indicators
Author: | Alexander Weissman |
---|---|
Year: | 2012 |
Subject: |
Mathematical optimization; Models, Statistical; Kullback–Leibler divergence; Psychometrics; Probabilistic latent semantic analysis; Applied Mathematics; Latent variable; Latent class model; ComputingMethodologies_PATTERNRECOGNITION; Data Interpretation, Statistical; Computer Science::Multimedia; Expectation–maximization algorithm; Convex optimization; Humans; Likelihood function; Categorical variable; Algorithms; General Psychology; Mathematics |
Source: | Psychometrika. 78:134–153 |
ISSN: | 1860-0980 0033-3123 |
DOI: | 10.1007/s11336-012-9295-z |
Description: | Convergence of the expectation–maximization (EM) algorithm to a global optimum of the marginal log-likelihood function for unconstrained latent variable models with categorical indicators is presented. Sufficient conditions under which global convergence of the EM algorithm is attainable are provided in an information-theoretic context by interpreting the EM algorithm as alternating minimization of the Kullback–Leibler divergence between two convex sets of probability distributions. It is shown that these conditions are satisfied by an unconstrained latent class model, yielding an optimal bound against which more highly constrained models may be compared. |
Database: | OpenAIRE |
External link: |
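The model class the abstract refers to can be illustrated with a minimal sketch, which is not the paper's code: an unconstrained latent class model with categorical indicators, fit by EM. The E-step (computing the posterior over latent classes) and the M-step (re-estimating class proportions and item-response probabilities) can each be read as a KL-divergence minimization over a convex set of distributions, which is the alternating-minimization view described above. All variable names and the synthetic data are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's implementation): EM for an
# unconstrained latent class model with J categorical indicators,
# K response categories per item, and C latent classes.
import numpy as np

rng = np.random.default_rng(0)
N, J, K, C = 300, 4, 3, 2

# Generate synthetic data from a known latent class model.
true_theta = rng.dirichlet(np.ones(K), size=(C, J))  # P(item j = k | class c)
true_pi = np.array([0.6, 0.4])                       # latent class proportions
z = rng.choice(C, size=N, p=true_pi)
X = np.array([[rng.choice(K, p=true_theta[z[n], j]) for j in range(J)]
              for n in range(N)])                    # (N, J) category codes

# Random initialization of the unconstrained parameters.
pi = np.full(C, 1.0 / C)
theta = rng.dirichlet(np.ones(K), size=(C, J))       # (C, J, K)

onehot = (X[:, :, None] == np.arange(K)).astype(float)  # (N, J, K)
log_liks = []
for _ in range(50):
    # E-step: posterior q(c | x_n); over all distributions q on the
    # classes, this is the KL-minimizing choice.
    log_p = np.log(pi)[None, :] + sum(
        np.log(theta[:, j, X[:, j]]).T for j in range(J))    # (N, C)
    m = log_p.max(axis=1, keepdims=True)
    log_norm = m + np.log(np.exp(log_p - m).sum(axis=1, keepdims=True))
    q = np.exp(log_p - log_norm)
    log_liks.append(log_norm.sum())                          # marginal log lik

    # M-step: closed-form updates; for the unconstrained model these are
    # simple weighted proportions.
    Nc = q.sum(axis=0)
    pi = Nc / N
    theta = np.einsum('nc,njk->cjk', q, onehot) / Nc[:, None, None]
```

Because the latent class model here is unconstrained, each M-step is a closed-form projection and the marginal log likelihood is non-decreasing across iterations, which is the behavior the paper's sufficient conditions guarantee globally for this model class.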