Yun Wei (Samsi and Duke University, USA) (webinar)
26 February 2021 @ 17:00 - 18:30
- Past event
“Obtaining faster convergence rates in finite mixture models by taking repeated measures”
Joint initiative with MIDAS Complex Data Modeling Research Network https://midas.mat.uc.cl/network/
Abstract: It is known that some finite mixture models suffer from slow convergence rates for estimating the component parameters; examples include mixtures of weakly identifiable families in the sense of [Ho and Nguyen 2016]. To obtain faster parameter convergence rates, we propose collecting more samples from each mixture component, so that each observation is a vector of samples drawn from the same mixture component. Such a model is known in the literature as a finite mixture model of repeated measures and has been applied in psychological studies and topic modeling. It also belongs to the class of mixtures of product distributions, with the special structure that the product distributions within each mixture component are identical. In this setup, each observation consists of conditionally independent and identically distributed samples and is therefore an exchangeable sequence.
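As an illustration only (not part of the talk), the repeated-measures sampling scheme described above can be sketched as follows: each observation first draws a latent component, then draws m conditionally i.i.d. samples from that component. All parameter values here are hypothetical.

```python
import numpy as np

# Hypothetical two-component Gaussian mixture with repeated measures.
rng = np.random.default_rng(0)

weights = np.array([0.3, 0.7])   # mixing proportions (hypothetical)
means = np.array([-1.0, 1.0])    # component means (hypothetical)
sigma = 1.0                      # common component scale
n, m = 500, 5                    # n observations, m repeated measures each

# For each observation, pick one component, then draw m samples from it;
# the m entries within a row are conditionally i.i.d. given the component,
# hence each row is an exchangeable sequence.
components = rng.choice(len(weights), size=n, p=weights)
data = rng.normal(loc=means[components, None], scale=sigma, size=(n, m))

print(data.shape)  # each of the n rows shares a single latent component
```

Increasing m gives more information about each observation's latent component, which is the mechanism the talk exploits for faster parameter rates.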
We show that by taking repeated measures (collecting more samples from each mixture component), a finite mixture model that is not originally identifiable becomes identifiable. Moreover, we obtain posterior contraction rates for parameter estimation, demonstrating that repeated measures are beneficial for estimating the component parameters. Our results hold for general probability families, including all regular exponential families, and can also be applied to hierarchical models. The key tool in developing these results is an inverse inequality that upper bounds a suitable distance between mixing measures by the total variation distance between the corresponding mixture densities.
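Schematically (the precise distances, constants, and exponents are those of the paper and are not reproduced here), an inverse inequality of this kind takes the following generic form:

```latex
% Schematic only: D is a suitable distance on mixing measures
% (e.g. a Wasserstein-type distance), V is the total variation distance,
% and p_G^{(m)} denotes the density of m repeated measures under the
% mixing measure G, with G_0 the true mixing measure.
\[
  D(G, G_0) \;\le\; C \, V\!\left(p_G^{(m)},\, p_{G_0}^{(m)}\right)
\]
% Closeness of the mixture densities then forces closeness of the mixing
% measures, converting density estimation rates into parameter rates.
```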
Based on joint work with Xuanlong Nguyen.