Paper Title


A Stochastic Path-Integrated Differential EstimatoR Expectation Maximization Algorithm

Authors

Gersende Fort, Eric Moulines, Hoi-To Wai

Abstract


The Expectation Maximization (EM) algorithm is of key importance for inference in latent variable models, including mixtures of regressors and experts, and models with missing observations. This paper introduces a novel EM algorithm, called \texttt{SPIDER-EM}, for inference from a training set of size $n$, $n \gg 1$. At the core of our algorithm is an estimator of the full conditional expectation in the {\sf E}-step, adapted from the stochastic path-integrated differential estimator ({\tt SPIDER}) technique. We derive finite-time complexity bounds for smooth non-convex likelihood: we show that for convergence to an $\varepsilon$-approximate stationary point, the complexity scales as $K_{\operatorname{Opt}}(n,\varepsilon) = {\cal O}(\varepsilon^{-1})$ and $K_{\operatorname{CE}}(n,\varepsilon) = n + \sqrt{n}\, {\cal O}(\varepsilon^{-1})$, where $K_{\operatorname{Opt}}(n,\varepsilon)$ and $K_{\operatorname{CE}}(n,\varepsilon)$ are respectively the number of {\sf M}-steps and the number of per-sample conditional expectation evaluations. This improves over state-of-the-art algorithms. Numerical results support our findings.
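The abstract's key idea can be illustrated in a minimal sketch: inside each epoch, the full-data conditional expectation of the sufficient statistics is updated with a SPIDER-style control-variate correction instead of being recomputed over all $n$ samples. The toy model below (a two-component Gaussian mixture with known unit variances and equal weights), the batch size, and the epoch schedule are all illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

# Hypothetical toy data: two-component Gaussian mixture, unit variances,
# equal weights; only the two means are estimated.
rng = np.random.default_rng(0)
n = 2000
z = rng.integers(0, 2, size=n)
x = rng.normal(np.where(z == 0, -2.0, 2.0), 1.0)

def cond_exp(theta, idx):
    """Per-sample conditional expectation s_bar(theta, i) of the sufficient
    statistics [r_i, r_i * x_i, (1 - r_i) * x_i], where r_i is the posterior
    probability that sample i belongs to component 1."""
    mu1, mu2 = theta
    xi = x[idx]
    logit = xi * (mu1 - mu2) - 0.5 * (mu1 ** 2 - mu2 ** 2)
    r = 1.0 / (1.0 + np.exp(-logit))
    return np.stack([r, r * xi, (1.0 - r) * xi], axis=1)

def m_step(S):
    """M-step: map the averaged sufficient statistics to the two means."""
    return (S[1] / S[0], S[2] / (1.0 - S[0]))

def spider_em(theta0, epochs=5, batch=32, seed=1):
    """Sketch of a SPIDER-style variance-reduced EM loop.

    Each epoch starts with a full refresh of the averaged statistic S
    (n conditional-expectation evaluations); inner iterations update S
    with the control-variate correction
        S <- S + mean_B[ s_bar(theta, i) - s_bar(theta_prev, i) ]
    over a minibatch B, then apply the M-step.
    """
    loop_rng = np.random.default_rng(seed)
    theta = theta0
    for _ in range(epochs):
        S = cond_exp(theta, np.arange(n)).mean(axis=0)  # full refresh
        theta_prev, theta = theta, m_step(S)
        for _ in range(n // batch):
            idx = loop_rng.integers(0, n, size=batch)
            S = S + (cond_exp(theta, idx)
                     - cond_exp(theta_prev, idx)).mean(axis=0)
            theta_prev, theta = theta, m_step(S)
    return theta

mu1, mu2 = spider_em(theta0=(-1.0, 1.0))
```

With this schedule, an epoch costs $n$ evaluations for the refresh plus two per minibatch sample for the corrections, which is the structure behind the $n + \sqrt{n}\,{\cal O}(\varepsilon^{-1})$ count of conditional-expectation evaluations quoted in the abstract.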
