Predictive recursion (PR) is a fast stochastic algorithm for nonparametric estimation of mixing distributions in mixture models. It is known that the PR estimates of both the mixing and mixture densities are consistent under fairly mild conditions, but currently very little is known about the rate of convergence. Here I first investigate asymptotic convergence properties of the PR estimate under model misspecification in the special case of finite mixtures with known support. Tools from stochastic approximation theory are used to prove that the PR estimates converge, to the best Kullback-Leibler approximation, at a nearly root-n rate. When the support is unknown, PR can be used to construct an objective function which, when optimized, yields an estimate of the support. I apply the known-support results to derive a rate of convergence for this modified PR estimate in the unknown-support case, which compares favorably to known optimal rates.
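For readers unfamiliar with the algorithm, the following is a minimal sketch of the standard predictive recursion update in the finite known-support case discussed above. The function names, the Gaussian kernel, and the weight sequence w_i proportional to 1/i are illustrative choices, not taken from the paper; the recursion itself is the usual one, f_i(u) = (1 - w_i) f_{i-1}(u) + w_i k(x_i | u) f_{i-1}(u) / m_{i-1}(x_i), where m_{i-1} is the current mixture density.

```python
import numpy as np

def predictive_recursion(x, support, kernel, w=None):
    """Predictive recursion estimate of a mixing distribution
    on a fixed, finite support grid.

    x       : 1-D array of observations, processed in order
    support : 1-D array of candidate support points u_1, ..., u_m
    kernel  : kernel(x_i, u) -> k(x_i | u), vectorized over u
    w       : weight sequence; defaults to w_i ~ 1 / i (illustrative)
    """
    n, m = len(x), len(support)
    f = np.full(m, 1.0 / m)              # uniform initial guess on the grid
    if w is None:
        w = 1.0 / (np.arange(n) + 2.0)   # decaying weights, a common choice
    for i in range(n):
        k = kernel(x[i], support)        # k(x_i | u_j) over the grid
        num = k * f                      # numerator k(x_i | u_j) f_{i-1}(u_j)
        # PR update: mix the old estimate with the normalized posterior weights
        f = (1.0 - w[i]) * f + w[i] * num / num.sum()
    return f

# Example: normal location mixture with known support {0, 3},
# true mixing weights (0.3, 0.7)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 700)])
rng.shuffle(x)
gauss = lambda xi, u: np.exp(-0.5 * (xi - u) ** 2) / np.sqrt(2 * np.pi)
f_hat = predictive_recursion(x, np.array([0.0, 3.0]), gauss)
```

Note that, unlike EM, the recursion makes a single pass over the data, which is the source of PR's speed; the estimate does, however, depend on the order in which observations are processed.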
Publisher Statement
NOTICE: this is the author’s version of a work that was accepted for publication in Statistics and Probability Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Statistics and Probability Letters, Vol 82, Issue 2, Feb 2012. DOI: 10.1016/j.spl.2011.10.023