Paper Title

Intent Contrastive Learning for Sequential Recommendation

Paper Authors

Yongjun Chen, Zhiwei Liu, Jia Li, Julian McAuley, Caiming Xiong

Abstract


Users' interactions with items are driven by various intents (e.g., preparing for holiday gifts, shopping for fishing equipment, etc.). However, users' underlying intents are often unobserved/latent, making it challenging to leverage such latent intents for sequential recommendation (SR). To investigate the benefits of latent intents and leverage them effectively for recommendation, we propose Intent Contrastive Learning (ICL), a general learning paradigm that leverages a latent intent variable into SR. The core idea is to learn users' intent distribution functions from unlabeled user behavior sequences and optimize SR models with contrastive self-supervised learning (SSL) by considering the learned intents to improve recommendation. Specifically, we introduce a latent variable to represent users' intents and learn the distribution function of the latent variable via clustering. We propose to leverage the learned intents into SR models via contrastive SSL, which maximizes the agreement between a view of a sequence and its corresponding intent. The training alternates between intent representation learning and SR model optimization steps within the generalized expectation-maximization (EM) framework. Fusing user intent information into SR also improves model robustness. Experiments conducted on four real-world datasets demonstrate the superiority of the proposed learning paradigm, which improves performance and robustness against data sparsity and noisy interaction issues.
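As a rough illustration of the alternating scheme the abstract describes, here is a minimal NumPy sketch (not the authors' implementation; all function names, shapes, and the plain k-means/InfoNCE choices are simplifying assumptions): clustering sequence representations stands in for the intent-distribution estimation step, and an InfoNCE-style loss measures agreement between each sequence representation and its assigned intent prototype.

```python
import numpy as np

def estimate_intents(X, k, iters=10, seed=0):
    # "E-step" sketch: estimate the latent intent distribution by clustering
    # sequence representations X (n, d) into k intent prototypes (hypothetical
    # simplification of the paper's clustering step).
    rng = np.random.default_rng(seed)
    prototypes = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each sequence representation to its nearest prototype
        dists = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # update each prototype to the mean of its assigned representations
        for j in range(k):
            if np.any(labels == j):
                prototypes[j] = X[labels == j].mean(axis=0)
    return prototypes, labels

def intent_contrastive_loss(views, prototypes, labels, temperature=0.1):
    # "M-step" ingredient: InfoNCE-style loss that maximizes agreement between
    # a sequence view and its assigned intent prototype (the positive), while
    # the other prototypes act as negatives.
    v = views / np.linalg.norm(views, axis=1, keepdims=True)
    c = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    logits = v @ c.T / temperature                      # (n, k) similarities
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(views)), labels].mean()

# Toy demo: two well-separated groups of synthetic sequence representations.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 8)),
               rng.normal(3.0, 0.1, (20, 8))])
prototypes, labels = estimate_intents(X, k=2)
loss = intent_contrastive_loss(X, prototypes, labels)
print(f"intent contrastive loss: {loss:.4f}")
```

In the full paradigm this loss would be added to the usual next-item prediction objective, and the two steps above would alternate each epoch, which is the generalized-EM view mentioned in the abstract.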
