Media type:
E-article
Title:
Computation for Latent Variable Model Estimation: A Unified Stochastic Proximal Framework
Contributors:
Zhang, Siliang;
Chen, Yunxiao
Published:
Springer Science and Business Media LLC, 2022
Published in:
Psychometrika, 87 (2022) 4, pp. 1473–1502
Language:
English
DOI:
10.1007/s11336-022-09863-9
ISSN:
0033-3123;
1860-0980
Description:
Latent variable models play a central role in psychometrics and related fields. In many modern applications, inference based on latent variable models involves one or more of the following features: (1) the presence of many latent variables, (2) observed and latent variables that are continuous, discrete, or a combination of both, (3) constraints on parameters, and (4) penalties on parameters to impose model parsimony. Estimation often amounts to maximizing an objective function based on a marginal likelihood or pseudo-likelihood, possibly with constraints and/or penalties on the parameters. Solving this optimization problem is highly non-trivial because of the complexities brought by the features above. Although several efficient algorithms have been proposed, a unified computational framework that accounts for all of these features has been lacking. In this paper, we fill that gap. Specifically, we provide a unified formulation of the optimization problem and propose a quasi-Newton stochastic proximal algorithm. Theoretical properties of the proposed algorithm are established, and its computational efficiency and robustness are demonstrated by simulation studies under various settings for latent variable model estimation.
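The optimization setting described in the abstract (a smooth data-fit term plus a possibly non-smooth penalty, handled by proximal steps) can be illustrated with a minimal sketch. This is not the paper's quasi-Newton stochastic proximal algorithm; it is a plain first-order stochastic proximal gradient method applied to a hypothetical L1-penalized least-squares toy problem, with decaying (Robbins–Monro-style) step sizes.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty t * ||x||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def stochastic_proximal_gradient(A, b, lam, n_iters=500, seed=0):
    # Toy illustration: minimize (1/2n) * ||A x - b||^2 + lam * ||x||_1
    # by sampling one data row per iteration, taking a gradient step on
    # that row's squared residual, then applying the proximal operator
    # of the penalty. A quasi-Newton variant would additionally scale
    # the gradient step by an approximate curvature matrix.
    rng = np.random.default_rng(seed)
    n, p = A.shape
    # Scale step sizes by the largest squared row norm for stability.
    L = max(float(np.max((A ** 2).sum(axis=1))), 1e-12)
    x = np.zeros(p)
    for k in range(1, n_iters + 1):
        i = rng.integers(n)
        step = 1.0 / (L * k)  # decaying step size (Robbins-Monro)
        grad = (A[i] @ x - b[i]) * A[i]
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The key structural point, shared with the framework in the article, is that the non-smooth part (penalties, and constraints via indicator functions) is handled entirely through the proximal operator, while the stochastic gradient step only touches the smooth part of the objective.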