Paper Title
Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification
Paper Authors
Paper Abstract
Learning time-series representations when only unlabeled data or few labeled samples are available can be a challenging task. Recently, contrastive self-supervised learning has shown great improvement in extracting useful representations from unlabeled data by contrasting different augmented views of the data. In this work, we propose a novel Time-Series representation learning framework via Temporal and Contextual Contrasting (TS-TCC) that learns representations from unlabeled data with contrastive learning. Specifically, we propose time-series-specific weak and strong augmentations and use their views to learn robust temporal relations in the proposed temporal contrasting module, in addition to learning discriminative representations with our proposed contextual contrasting module. Additionally, we conduct a systematic study of time-series data augmentation selection, which is a key part of contrastive learning. We also extend TS-TCC to the semi-supervised learning setting and propose a Class-Aware TS-TCC (CA-TCC) that benefits from the few available labeled samples to further improve the representations learned by TS-TCC. Specifically, we leverage the robust pseudo labels produced by TS-TCC to realize a class-aware contrastive loss. Extensive experiments show that a linear evaluation of the features learned by our proposed framework performs comparably with fully supervised training. Additionally, our framework shows high efficiency in few-labeled-data and transfer learning scenarios. The code is publicly available at \url{https://github.com/emadeldeen24/CA-TCC}.
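To make the abstract's two augmentation families and the contextual contrasting idea concrete, the following is a minimal NumPy/PyTorch sketch. It assumes a jitter-and-scale transform as the weak view, a permutation-and-jitter transform as the strong view, and an NT-Xent-style loss over per-sample context vectors; the function names, tensor shapes, and hyperparameter values are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch (not the authors' code): weak/strong time-series augmentations
# and an NT-Xent-style contextual contrastive loss. Shapes, parameter values,
# and function names are assumptions for illustration only.
import numpy as np
import torch
import torch.nn.functional as F


def weak_augment(x, jitter_sigma=0.05, scale_sigma=0.1):
    """Jitter-and-scale: add small Gaussian noise and rescale each channel.

    x: np.ndarray of shape (batch, channels, time)
    """
    noise = np.random.normal(0.0, jitter_sigma, size=x.shape)
    scale = np.random.normal(1.0, scale_sigma, size=(x.shape[0], x.shape[1], 1))
    return x * scale + noise


def strong_augment(x, max_segments=5, jitter_sigma=0.1):
    """Permutation-and-jitter: split the time axis into random segments,
    shuffle them, then add Gaussian noise."""
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        n_seg = np.random.randint(1, max_segments + 1)
        splits = np.sort(np.random.choice(x.shape[2] - 1, n_seg - 1, replace=False) + 1)
        segments = np.split(np.arange(x.shape[2]), splits)
        order = np.random.permutation(len(segments))
        idx = np.concatenate([segments[j] for j in order])
        out[i] = x[i][:, idx]
    return out + np.random.normal(0.0, jitter_sigma, size=x.shape)


def nt_xent_loss(z1, z2, temperature=0.2):
    """Contextual contrasting: the contexts of the two views of the same sample
    form the positive pair; all other contexts in the batch are negatives.

    z1, z2: torch.Tensor of shape (batch, dim)
    """
    batch = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # (2B, D)
    sim = torch.mm(z, z.t()) / temperature                # (2B, 2B) similarities
    sim.fill_diagonal_(float("-inf"))                      # exclude self-similarity
    targets = torch.cat([torch.arange(batch, 2 * batch),   # positive index for view 1
                         torch.arange(0, batch)])          # positive index for view 2
    return F.cross_entropy(sim, targets)
```

In this setup, each sample's two augmented contexts are the positive pair while every other context in the batch acts as a negative, so the temperature-scaled cross-entropy behaves as a contrastive objective. A class-aware variant, as described for CA-TCC, would instead treat all samples sharing the same (pseudo) label as positives.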