Paper Title

Stochastic Item Descent Method for Large Scale Equal Circle Packing Problem

Authors

He, Kun; Zhang, Min; Zhou, Jianrong; Jin, Yan; Li, Chu-min

Abstract

Stochastic gradient descent (SGD) is a powerful method for large-scale optimization problems in the area of machine learning, especially for finite-sum formulations with numerous variables. In recent years, mini-batch SGD has achieved great success and has become a standard technique for training deep neural networks on large amounts of data. Inspired by its success in deep learning, we apply the idea of SGD with batch selection of samples to a classic optimization problem in its decision version. Given $n$ unit circles, the equal circle packing problem (ECPP) asks whether there exists a feasible packing that puts all the circles inside a circular container without overlapping. Specifically, we propose a stochastic item descent method (SIDM) for large-scale ECPP, which randomly divides the unit circles into batches and iteratively runs the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm on the corresponding batch functions to speed up the computation. We also increase the batch size during the batch iterations to obtain higher-quality solutions. Compared to the current best packing algorithms, SIDM greatly speeds up the optimization process and guarantees the solution quality on large-scale instances with up to 1500 circle items, while the baseline algorithms usually handle about 300 circle items. The results indicate the high efficiency of SIDM on this classic optimization problem at large scale, and show its potential for other large-scale classic optimization problems in which gradient descent is used for optimization.
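To make the batch-BFGS idea concrete, below is a minimal Python sketch of how stochastic item descent could be organized: circles are randomly partitioned into batches, each batch's positions are optimized by BFGS against a quadratic overlap potential, and the number of batches is halved between stages so the batch size grows. The potential function, the batch schedule, and all names (batch_energy, sidm, n_batches, rounds, container_r) are illustrative assumptions, not the paper's exact formulation; it uses scipy.optimize.minimize with the standard BFGS method.

```python
import numpy as np
from scipy.optimize import minimize

def batch_energy(flat_xy, batch_idx, centers, container_r):
    """Illustrative overlap potential for one batch of unit circles.

    Quadratically penalizes pairwise overlaps involving at least one
    batch circle and any protrusion beyond the circular container of
    radius container_r. Circles outside the batch stay fixed.
    """
    pts = centers.copy()
    pts[batch_idx] = flat_xy.reshape(-1, 2)
    energy = 0.0
    # Container term: a unit circle's center must stay within
    # distance container_r - 1 of the origin.
    dist = np.linalg.norm(pts[batch_idx], axis=1)
    energy += np.sum(np.maximum(dist + 1.0 - container_r, 0.0) ** 2)
    # Pair terms: two unit circles overlap when their centers are
    # closer than 2.
    for i in batch_idx:
        for j in range(len(pts)):
            if j == i:
                continue
            depth = 2.0 - np.linalg.norm(pts[i] - pts[j])
            if depth > 0.0:
                # In-batch pairs are visited twice, hence weight 0.5.
                energy += 0.5 * depth ** 2
    return energy

def sidm(n, container_r, n_batches=8, rounds=10, seed=0):
    """Sketch of stochastic item descent: run BFGS on random batches,
    doubling the batch size (halving the batch count) per stage."""
    rng = np.random.default_rng(seed)
    # Random initial layout inside the container.
    centers = (rng.random((n, 2)) - 0.5) * 2.0 * (container_r - 1.0)
    while n_batches >= 1:
        for _ in range(rounds):
            for batch_idx in np.array_split(rng.permutation(n), n_batches):
                res = minimize(batch_energy,
                               centers[batch_idx].ravel(),
                               args=(batch_idx, centers, container_r),
                               method="BFGS")
                centers[batch_idx] = res.x.reshape(-1, 2)
        n_batches //= 2  # final stage runs full BFGS on all circles
    return centers  # feasible when the total overlap is (near) zero
```

A layout returned with numerically zero total potential answers the ECPP decision question affirmatively for that container radius. In practice one would supply analytic gradients to BFGS rather than rely on the finite-difference default used in this sketch.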
