Paper Title

When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing

Authors

Chao-Han Huck Yang, Jun Qi, Samuel Yen-Chi Chen, Yu Tsao, Pin-Yu Chen

Abstract

The rapid development of quantum computing has demonstrated many unique characteristics of quantum advantages, such as richer feature representation and more secure protection of model parameters. This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification. In particular, our proposed hybrid classical-quantum model consists of a novel random quantum temporal convolution (QTC) learning framework that replaces some layers in the BERT-based decoder. Our experiments on intent classification show that our proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets. In particular, BERT-QTC boosts the performance of the existing quantum circuit-based language model on the two text classification datasets by relative improvements of 1.57% and 1.52%. Furthermore, BERT-QTC can be feasibly deployed on both existing commercially accessible quantum computation hardware and a CPU-based interface to ensure data isolation.
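The abstract describes a "quantum temporal convolution": the same small variational quantum circuit is slid over the time axis of the encoder's token features, like a 1-D convolution whose kernel is a parameterized circuit. The following is a minimal, classically simulated sketch of that idea only; the layer shapes, the single-qubit circuit, the window pooling, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued, so plain numpy suffices)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def vqc_expectation(inputs, weights):
    """Illustrative one-qubit variational circuit per feature: angle-encode
    the input, apply a trainable RY rotation, then read out <Z>."""
    out = []
    for x, w in zip(inputs, weights):
        state = np.array([1.0, 0.0])              # |0>
        state = ry(x) @ state                     # data encoding
        state = ry(w) @ state                     # trainable rotation
        z = abs(state[0])**2 - abs(state[1])**2   # Pauli-Z expectation in [-1, 1]
        out.append(z)
    return np.array(out)

def quantum_temporal_convolution(seq, weights, window=3):
    """Slide the same variational circuit over time: pool each window of
    token features into rotation angles and map them through the circuit."""
    T, D = seq.shape
    feats = []
    for t in range(T - window + 1):
        patch = seq[t:t + window].mean(axis=0)    # (D,) angles for this window
        feats.append(vqc_expectation(patch, weights))
    return np.stack(feats)                        # (T - window + 1, D)

rng = np.random.default_rng(0)
token_embeddings = rng.normal(size=(8, 4))        # stand-in for BERT encoder outputs
weights = rng.normal(size=4)                      # trainable circuit parameters
features = quantum_temporal_convolution(token_embeddings, weights)
print(features.shape)                             # (6, 4)
```

In the paper's hybrid setting these circuit outputs would feed the remaining classical classifier layers, with the circuit parameters trained jointly; the classical simulation above only conveys the shape of the computation.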
