Title
Sequential Gradient Descent and Quasi-Newton's Method for Change-Point Analysis
Authors
Abstract
One common approach to detecting change-points is minimizing a cost function over the possible numbers and locations of change-points. This framework includes several well-established procedures, such as the penalized likelihood and minimum description length. Such an approach requires finding the cost value repeatedly over different segments of the data set, which can be time-consuming when (i) the data sequence is long and (ii) obtaining the cost value involves solving a non-trivial optimization problem. This paper introduces a new sequential method (SE) that can be coupled with gradient descent (SeGD) and quasi-Newton's method (SeN) to find the cost value efficiently. The core idea is to update the cost value using the information from previous steps without re-optimizing the objective function. The new method is applied to change-point detection in generalized linear models and penalized regression. Numerical studies show that the new approach can be orders of magnitude faster than the Pruned Exact Linear Time (PELT) method without sacrificing estimation accuracy.
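To illustrate the core idea described above, the following is a minimal, hypothetical sketch (not the paper's actual algorithm) of computing segment costs for a logistic-regression model. The exact approach refits the model from scratch for every segment endpoint, while the sequential variant grows the segment one point at a time and takes a single gradient step from the previous parameter estimate instead of re-optimizing; all function names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def _nll(X, y, beta, eps=1e-12):
    """Negative log-likelihood of a logistic model: the segment 'cost'."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def segment_cost_exact(X, y, n_iter=200, lr=0.1):
    """Fully re-optimize: run gradient descent to convergence on the
    whole segment, then return its cost. This is the expensive step
    that exact search methods repeat for every candidate segment."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta -= lr * X.T @ (p - y) / len(y)
    return _nll(X, y, beta)

def sequential_costs(X, y, lr=0.1):
    """Sequential sketch: as the segment end t grows, take ONE gradient
    step from the previous estimate (no re-optimization) and record an
    approximate cost for the segment [0, t)."""
    n, d = X.shape
    beta = np.zeros(d)
    costs = []
    for t in range(1, n + 1):
        Xt, yt = X[:t], y[:t]
        p = 1.0 / (1.0 + np.exp(-Xt @ beta))
        beta -= lr * Xt.T @ (p - yt) / t  # single update, reusing beta
        costs.append(_nll(Xt, yt, beta))
    return costs
```

Over a data sequence of length n, the exact approach spends a full optimization at each endpoint, whereas the sequential variant spends one gradient step per endpoint, which is the source of the speed-up the abstract refers to.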