Paper Title
Comparison of Attention-based Deep Learning Models for EEG Classification
Paper Authors
Paper Abstract
Objective: To evaluate the impact of different kinds of attention mechanisms in Deep Learning (DL) models on Electroencephalography (EEG) classification. Methods: We compared three attention-enhanced DL models: the novel InstaGATs, an LSTM with attention, and a CNN with attention. We used these models to classify normal and abnormal (i.e., artifactual or pathological) EEG patterns. Results: We achieved state-of-the-art performance in all classification problems, regardless of the large variability of the datasets and the simple architecture of the attention-enhanced models. We also showed that, depending on how the attention mechanism is applied and where the attention layer is located in the model, we can selectively leverage the information contained in the time, frequency, or spatial domain of the dataset. Conclusions: With this work, we shed light on the role of different attention mechanisms in the classification of normal and abnormal EEG patterns. Moreover, we discussed how these mechanisms can exploit the intrinsic relationships within the temporal, frequency, and spatial domains of brain activity. Significance: Attention represents a promising strategy to evaluate the quality and relevance of EEG information in different real-world scenarios. Moreover, it can make it easier to parallelize the computation and, thus, to speed up the analysis of large electrophysiological (e.g., EEG) datasets.
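To make the idea of an attention-enhanced DL model for EEG concrete, the sketch below shows one common way to combine an LSTM with an attention layer that pools hidden states over time before classification. This is an illustrative assumption, not the architecture used in the paper: the class name `AttentionLSTMClassifier`, the layer sizes, the channel count, and the sampling rate are all hypothetical placeholders.

```python
# Minimal sketch (not the authors' implementation): an LSTM whose hidden
# states are pooled by a learned attention layer over the time axis, then
# classified as normal vs. abnormal EEG. All sizes are illustrative.
import torch
import torch.nn as nn


class AttentionLSTMClassifier(nn.Module):
    def __init__(self, n_channels=19, hidden_size=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_size, batch_first=True)
        # One score per time step; softmax turns scores into attention weights.
        self.attn = nn.Linear(hidden_size, 1)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                       # x: (batch, time, channels)
        h, _ = self.lstm(x)                     # h: (batch, time, hidden)
        w = torch.softmax(self.attn(h), dim=1)  # w: (batch, time, 1)
        context = (w * h).sum(dim=1)            # weighted sum over time steps
        return self.fc(context)                 # logits: (batch, n_classes)


# Example: a batch of 8 one-second EEG epochs sampled at 128 Hz, 19 channels.
logits = AttentionLSTMClassifier()(torch.randn(8, 128, 19))
```

Attending over the time axis, as here, emphasizes informative time windows; placing an analogous attention layer over channels or spectral features would instead weight the spatial or frequency domain, which is the kind of design choice the abstract refers to.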