Paper Title

An augmented Lagrangian method with constraint generation for shape-constrained convex regression problems

Authors

Meixia Lin, Defeng Sun, Kim-Chuan Toh

Abstract

Shape-constrained convex regression problem deals with fitting a convex function to the observed data, where additional constraints are imposed, such as component-wise monotonicity and uniform Lipschitz continuity. This paper provides a unified framework for computing the least squares estimator of a multivariate shape-constrained convex regression function in $\mathbb{R}^d$. We prove that the least squares estimator is computable via solving an essentially constrained convex quadratic programming (QP) problem with $(d+1)n$ variables, $n(n-1)$ linear inequality constraints and $n$ possibly non-polyhedral inequality constraints, where $n$ is the number of data points. To efficiently solve the generally very large-scale convex QP, we design a proximal augmented Lagrangian method (proxALM) whose subproblems are solved by the semismooth Newton method (SSN). To further accelerate the computation when $n$ is huge, we design a practical implementation of the constraint generation method such that each reduced problem is efficiently solved by our proposed proxALM. Comprehensive numerical experiments, including those in the pricing of basket options and estimation of production functions in economics, demonstrate that our proposed proxALM outperforms the state-of-the-art algorithms, and the proposed acceleration technique further shortens the computation time by a large margin.
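As a sketch, the convex QP described in the abstract can be written in the standard convex-regression form below, where $\theta_i$ estimates the function value at the data point $X_i$ and $\xi_i$ a subgradient there; the set $\mathcal{D}$ is a label introduced here for illustration to denote the additional shape constraints:

```latex
\min_{\substack{\theta \in \mathbb{R}^n \\ \xi_1,\dots,\xi_n \in \mathbb{R}^d}}
  \; \frac{1}{2} \sum_{i=1}^{n} \bigl(Y_i - \theta_i\bigr)^2
\quad \text{s.t.} \quad
  \theta_j \ge \theta_i + \xi_i^{\top}(X_j - X_i) \;\; \forall\, i \ne j,
\qquad
  \xi_i \in \mathcal{D} \;\; \forall\, i.
```

This accounts for the stated problem sizes: $(d+1)n$ variables ($\theta$ plus the $n$ subgradients $\xi_i$), $n(n-1)$ linear inequality constraints enforcing convexity, and $n$ constraints $\xi_i \in \mathcal{D}$, which are non-polyhedral when, for example, $\mathcal{D} = \{\xi : \|\xi\|_2 \le L\}$ encodes uniform Lipschitz continuity.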
