Paper Title
On stochastic mirror descent with interacting particles: convergence properties and variance reduction
Paper Authors
Paper Abstract
An open problem in optimization with noisy information is the computation of an exact minimizer that is independent of the amount of noise. A standard practice in stochastic approximation algorithms is to use a decreasing step-size. This, however, leads to slower convergence. A second alternative is to use a fixed step-size, run independent replicas of the algorithm, and average their outputs. A third option is to run replicas of the algorithm and allow them to interact. It is unclear which of these options works best. To address this question, we reduce the problem of computing an exact minimizer with noisy gradient information to the study of stochastic mirror descent with interacting particles. We study the convergence of stochastic mirror descent and make explicit the tradeoffs between communication and variance reduction. We provide theoretical and numerical evidence to suggest that interaction helps to improve convergence and reduce the variance of the estimate.
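The contrast between independent and interacting replicas described in the abstract can be illustrated with a minimal sketch. Here the mirror map is taken as the identity (so mirror descent reduces to plain SGD), the objective is a simple quadratic with artificially noisy gradients, and the interaction is a basic consensus step pulling each particle toward the ensemble mean; these are illustrative assumptions, not the paper's exact scheme or parameters.

```python
import numpy as np

def interacting_sgd(n_particles=50, n_steps=2000, step=0.05,
                    interact=0.1, noise=1.0, seed=0):
    """Fixed-step SGD particles on f(x) = 0.5 * (x - 1)^2 with noisy
    gradients and an optional averaging interaction (set interact=0.0
    for fully independent replicas)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_particles)  # initial particle positions
    for _ in range(n_steps):
        # Noisy gradient of the quadratic: (x - 1) plus Gaussian noise.
        grad = (x - 1.0) + noise * rng.normal(size=n_particles)
        x = x - step * grad
        # Interaction: pull each particle toward the ensemble mean.
        x = (1.0 - interact) * x + interact * x.mean()
    return x

coupled = interacting_sgd(interact=0.1)
independent = interacting_sgd(interact=0.0)
# With a fixed step-size, the interacting ensemble concentrates more
# tightly around the minimizer x* = 1 than the independent replicas do.
print(coupled.std(), independent.std())
```

The fixed step-size keeps each particle bouncing in a noise-dominated neighborhood of the minimizer; the consensus step shrinks that spread at the cost of one communication (the ensemble mean) per iteration, which is the communication/variance tradeoff the abstract refers to.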