Paper Title
EgoK360: A 360 Egocentric Kinetic Human Activity Video Dataset
Paper Authors
Paper Abstract
Recently, there has been growing interest in wearable sensors, which provides new research perspectives for 360° video analysis. However, the lack of 360° datasets in the literature hinders research in this field. To bridge this gap, in this paper we propose a novel Egocentric (first-person) 360° Kinetic human activity video dataset (EgoK360). The EgoK360 dataset contains annotations of human activities with different sub-actions, e.g., the activity Ping-Pong with four sub-actions: pickup-ball, hit, bounce-ball, and serve. To the best of our knowledge, EgoK360 is the first dataset in the domain of first-person activity recognition with a 360° environmental setup, which will facilitate egocentric 360° video understanding. We provide experimental results and a comprehensive analysis of variants of the two-stream network for 360° egocentric activity recognition. The EgoK360 dataset can be downloaded from https://egok360.github.io/.
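For readers unfamiliar with the two-stream approach the abstract refers to, the sketch below illustrates the general idea: a spatial stream over RGB frames and a temporal stream over stacked optical flow, fused at the score level. This is a minimal generic baseline in PyTorch, not the authors' exact architecture; the backbone (ResNet-18), the number of stacked flow frames (`flow_stack`), and the class count are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18


class TwoStreamActivityNet(nn.Module):
    """Late-fusion two-stream baseline (illustrative sketch):
    one CNN over an RGB frame (spatial stream) and one over stacked
    optical-flow fields (temporal stream); class scores are averaged."""

    def __init__(self, num_classes: int, flow_stack: int = 10):
        super().__init__()
        # Spatial stream: standard 3-channel RGB input.
        self.spatial = resnet18(weights=None)
        self.spatial.fc = nn.Linear(self.spatial.fc.in_features, num_classes)
        # Temporal stream: first conv adapted to 2 * flow_stack channels
        # (x/y flow components per stacked frame).
        self.temporal = resnet18(weights=None)
        self.temporal.conv1 = nn.Conv2d(2 * flow_stack, 64, kernel_size=7,
                                        stride=2, padding=3, bias=False)
        self.temporal.fc = nn.Linear(self.temporal.fc.in_features, num_classes)

    def forward(self, rgb: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
        # rgb:  (B, 3, H, W)              e.g. an equirectangular 360° frame
        # flow: (B, 2 * flow_stack, H, W) stacked optical flow
        return 0.5 * (self.spatial(rgb) + self.temporal(flow))


if __name__ == "__main__":
    # Hypothetical usage with 8 activity classes and dummy inputs.
    model = TwoStreamActivityNet(num_classes=8)
    rgb = torch.randn(2, 3, 224, 224)
    flow = torch.randn(2, 20, 224, 224)
    print(model(rgb, flow).shape)  # torch.Size([2, 8])
```

In practice, variants of this design differ mainly in where the two streams are fused (score-level as above, or at an intermediate feature layer) and in how the 360° frames are projected before being fed to the CNNs.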