Paper Title

Distributionally Robust Bayesian Optimization with $φ$-divergences

Paper Authors

Hisham Husain, Vu Nguyen, Anton van den Hengel

Paper Abstract

The study of robustness has received much attention due to its inevitability in data-driven settings where many systems face uncertainty. One such example of concern is Bayesian Optimization (BO), where uncertainty is multi-faceted, yet only a limited number of works are dedicated to this direction. In particular, the work of Kirschner et al. (2020) bridges the existing literature on Distributionally Robust Optimization (DRO) by casting the BO problem through the lens of DRO. While this work is pioneering, it admittedly suffers from various practical shortcomings, such as finite context assumptions, leaving open the main question: can one devise a computationally tractable algorithm for solving this DRO-BO problem? In this work, we tackle this question in considerable generality by considering robustness against data shift in $φ$-divergences, which subsumes many popular choices such as the $χ^2$-divergence, Total Variation, and the widely used Kullback-Leibler (KL) divergence. We show that the DRO-BO problem in this setting is equivalent to a finite-dimensional optimization problem which, even in the continuous context setting, can be easily implemented with provable sublinear regret bounds. We then show experimentally that our method surpasses existing methods, attesting to the theoretical results.
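
For context, the following is a minimal sketch of the standard $φ$-divergence DRO setup written in generic notation; the symbols $f$, $x$, $c$, $p$, and $ε$ are illustrative and are not taken verbatim from the paper. The DRO-BO objective robustifies the expected reward over an ambiguity ball of context distributions around a reference distribution $p$:

$$
\max_{x \in \mathcal{X}} \; \min_{q \,:\, D_φ(q \,\|\, p) \le ε} \; \mathbb{E}_{c \sim q}\big[f(x, c)\big],
\qquad
D_φ(q \,\|\, p) = \int φ\!\left(\frac{\mathrm{d}q}{\mathrm{d}p}\right) \mathrm{d}p,
$$

where $φ$ is convex with $φ(1) = 0$; the choices $φ(t) = (t-1)^2$, $φ(t) = \tfrac{1}{2}|t-1|$, and $φ(t) = t \log t$ recover the $χ^2$, Total Variation, and KL cases named above. A reduction to a finite-dimensional problem is typically obtained through the classical duality for $φ$-divergence balls (see, e.g., Ben-Tal et al., 2013), which, applied here to the loss $\ell = -f(x, \cdot)$, replaces the inner optimization over distributions with an optimization over two scalars:

$$
\sup_{q \,:\, D_φ(q \,\|\, p) \le ε} \mathbb{E}_{c \sim q}\big[\ell(c)\big]
= \inf_{\lambda \ge 0,\, \eta \in \mathbb{R}} \left\{ \lambda ε + \eta + \lambda\, \mathbb{E}_{c \sim p}\!\left[ φ^{*}\!\left( \frac{\ell(c) - \eta}{\lambda} \right) \right] \right\},
$$

with $φ^{*}$ the convex conjugate of $φ$. How this reduction is instantiated in the BO acquisition step, and how the sublinear regret bounds follow, is developed in the paper itself rather than in this sketch.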
