Paper Title
Quantum Regularized Least Squares
Paper Authors
Paper Abstract
Linear regression is a widely used technique to fit linear models and finds widespread applications across different areas such as machine learning and statistics. In most real-world scenarios, however, linear regression problems are often ill-posed or the underlying model suffers from overfitting, leading to erroneous or trivial solutions. This is often dealt with by adding extra constraints, known as regularization. In this paper, we use the frameworks of block-encoding and quantum singular value transformation (QSVT) to design the first quantum algorithms for quantum least squares with general $\ell_2$-regularization. These include regularized versions of quantum ordinary least squares, quantum weighted least squares, and quantum generalized least squares. Our quantum algorithms substantially improve upon prior results on quantum ridge regression (polynomial improvement in the condition number and an exponential improvement in accuracy), which is a particular case of our result. To this end, we assume approximate block-encodings of the underlying matrices as input and use robust QSVT algorithms for various linear algebra operations. In particular, we develop a variable-time quantum algorithm for matrix inversion using QSVT, where we use quantum eigenvalue discrimination as a subroutine instead of gapped phase estimation. This ensures that substantially fewer ancilla qubits are required for this procedure than prior results. Owing to the generality of the block-encoding framework, our algorithms are applicable to a variety of input models and can also be seen as improved and generalized versions of prior results on standard (non-regularized) quantum least squares algorithms.
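For orientation, the classical problem underlying these quantum algorithms can be written out as below. This is a standard textbook formulation, not quoted from the paper; the symbols $A$, $b$, $L$, $\lambda$, $W$, and $\Omega$ are chosen here for illustration and need not match the paper's exact conventions.

$$
x^{*} \;=\; \arg\min_{x}\; \|Ax - b\|_{2}^{2} \;+\; \lambda\,\|Lx\|_{2}^{2}
\;=\; \bigl(A^{\dagger}A + \lambda\, L^{\dagger}L\bigr)^{-1} A^{\dagger} b,
$$

provided the matrix in parentheses is invertible (e.g., $\lambda > 0$ and $L$ of full column rank). Ridge regression is the special case $L = I$. The weighted variant replaces the residual term by $(Ax-b)^{\dagger} W (Ax-b)$ for a diagonal weight matrix $W$, giving $x^{*} = (A^{\dagger} W A + \lambda\, L^{\dagger}L)^{-1} A^{\dagger} W b$; the generalized variant uses a covariance matrix $\Omega$ and the residual $(Ax-b)^{\dagger}\Omega^{-1}(Ax-b)$, with the analogous closed form. Quantum least-squares algorithms in this line of work typically aim to prepare a quantum state proportional to such a solution vector rather than output it classically.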