Paper Title

Unifying the Discrete and Continuous Emotion labels for Speech Emotion Recognition

Authors

Sharma, Roshan, Dhamyal, Hira, Raj, Bhiksha, Singh, Rita

Abstract

Traditionally, in paralinguistic analysis for emotion detection from speech, emotions have been identified with discrete or dimensional (continuous-valued) labels. Accordingly, models that have been proposed for emotion detection use one or the other of these label types. However, psychologists like Russell and Plutchik have proposed theories and models that unite these views, maintaining that these representations have shared and complementary information. This paper is an attempt to validate these viewpoints computationally. To this end, we propose a model to jointly predict continuous and discrete emotional attributes and show how the relationship between these can be utilized to improve the robustness and performance of emotion recognition tasks. Our approach comprises multi-task and hierarchical multi-task learning frameworks that jointly model the relationships between continuous-valued and discrete emotion labels. Experimental results on two widely used datasets (IEMOCAP and MSPPodcast) for speech-based emotion recognition show that our model results in statistically significant improvements in performance over strong baselines with non-unified approaches. We also demonstrate that using one type of label (discrete or continuous-valued) for training improves recognition performance in tasks that use the other type of label. Experimental results and reasoning for this approach (called the mismatched training approach) are also presented.
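The abstract's multi-task idea, i.e. a shared speech encoder feeding both a discrete-emotion classifier and a continuous-attribute regressor trained with a joint loss, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual architecture; all names, layer choices, and dimensions (e.g. 4 emotion classes, 3 dimensions for valence/arousal/dominance) are assumptions.

```python
import torch
import torch.nn as nn

class JointEmotionModel(nn.Module):
    """Hypothetical multi-task sketch: a shared encoder feeds a
    discrete-emotion classification head and a continuous-valued
    (dimensional) regression head. Sizes are illustrative only."""
    def __init__(self, feat_dim=40, hidden=128, n_classes=4, n_dims=3):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True)
        # e.g. angry / happy / sad / neutral (assumed label set)
        self.discrete_head = nn.Linear(hidden, n_classes)
        # e.g. valence / arousal / dominance (assumed attributes)
        self.continuous_head = nn.Linear(hidden, n_dims)

    def forward(self, x):
        _, h = self.encoder(x)      # h: (num_layers, batch, hidden)
        h = h.squeeze(0)            # (batch, hidden)
        return self.discrete_head(h), self.continuous_head(h)

model = JointEmotionModel()
speech = torch.randn(2, 100, 40)    # (batch, frames, acoustic features)
logits, dims = model(speech)

# Joint multi-task objective: classification + regression losses summed.
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 2])) \
     + nn.MSELoss()(dims, torch.zeros(2, 3))
```

In the hierarchical variant described in the abstract, one head's prediction could additionally condition the other (e.g. feeding the continuous attributes into the discrete classifier) rather than sharing only the encoder.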
