Journal article

Mini-batch learning of exponential family finite mixture models

Hien D. Nguyen 1, Florence Forbes 2, Geoffrey McLachlan 3
2 MISTIS - Modelling and Inference of Complex and Structured Stochastic Systems
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann, INPG - Institut National Polytechnique de Grenoble
Abstract: Mini-batch algorithms have become increasingly popular due to the need to solve optimization problems based on large-scale data sets. Using an existing online expectation-maximization (EM) algorithm framework, we demonstrate how mini-batch (MB) algorithms may be constructed, and we propose a scheme for the stochastic stabilization of the constructed mini-batch algorithms. Theoretical results regarding the convergence of the mini-batch EM algorithms are presented. We then demonstrate how the mini-batch framework may be applied to conduct maximum likelihood (ML) estimation of mixtures of exponential family distributions, with emphasis on ML estimation for mixtures of normal distributions. Via a simulation study, we demonstrate that the mini-batch algorithm for mixtures of normal distributions can outperform the standard EM algorithm. Further evidence of the performance of the mini-batch framework is provided via an application to the famous MNIST data set.
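To illustrate the idea described in the abstract, the following is a minimal sketch of a mini-batch EM algorithm for a univariate normal mixture, in the spirit of the online-EM framework: each mini-batch updates a stochastic running average of the expected sufficient statistics, which is then mapped back to the parameters. The step-size schedule `gamma_t = (t + 2)**(-lr_exp)`, the quantile-based initialization, and all function and parameter names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def minibatch_em_gmm(X, K, batch_size=256, n_epochs=20, lr_exp=0.6, seed=0):
    """Mini-batch EM sketch for a K-component univariate normal mixture.

    Stochastically averages the expected sufficient statistics (online-EM
    style); schedule and initialization below are illustrative choices.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Spread the initial means over the data range via quantiles.
    mu = np.quantile(X, (np.arange(K) + 1.0) / (K + 1.0))
    var = np.full(K, X.var())
    pi = np.full(K, 1.0 / K)
    # Running averages of the per-component sufficient statistics:
    # s0 ~ E[z_k], s1 ~ E[z_k x], s2 ~ E[z_k x^2].
    s0, s1, s2 = pi.copy(), pi * mu, pi * (var + mu**2)
    t = 0
    for _ in range(n_epochs):
        for _ in range(max(1, n // batch_size)):
            xb = X[rng.integers(0, n, batch_size)]   # mini-batch (with replacement)
            # E-step on the batch: responsibilities r[i, k].
            logp = (-0.5 * (xb[:, None] - mu) ** 2 / var
                    - 0.5 * np.log(2.0 * np.pi * var) + np.log(pi))
            logp -= logp.max(axis=1, keepdims=True)  # stabilize the exponentials
            r = np.exp(logp)
            r /= r.sum(axis=1, keepdims=True)
            # Stochastic-approximation update of the running statistics.
            gamma = (t + 2.0) ** (-lr_exp)
            s0 = (1 - gamma) * s0 + gamma * r.mean(axis=0)
            s1 = (1 - gamma) * s1 + gamma * (r * xb[:, None]).mean(axis=0)
            s2 = (1 - gamma) * s2 + gamma * (r * xb[:, None] ** 2).mean(axis=0)
            # M-step: map the averaged statistics back to the parameters.
            pi = s0 / s0.sum()
            mu = s1 / s0
            var = np.maximum(s2 / s0 - mu**2, 1e-6)  # keep variances positive
            t += 1
    return pi, mu, var
```

Because each update touches only a mini-batch rather than the full data set, the per-iteration cost is O(batch_size * K) instead of O(n * K), which is the practical appeal of the mini-batch construction on large-scale data.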

Cited literature: 53 references
Contributor: Florence Forbes
Submitted on: Thursday, March 26, 2020, 2:35:19 PM
Last modified on: Monday, March 30, 2020, 3:30:10 PM


Files produced by the author(s)
Hien D. Nguyen, Florence Forbes, Geoffrey McLachlan. Mini-batch learning of exponential family finite mixture models. Statistics and Computing, Springer Verlag (Germany), in press, pp. 1-40. ⟨10.1007/s11222-019-09919-4⟩. ⟨hal-02415068v2⟩


