Paper Title

Tracking by Associating Clips

Paper Authors

Woo, Sanghyun; Park, Kwanyong; Oh, Seoung Wug; Kweon, In So; Lee, Joon-Young

Paper Abstract

The tracking-by-detection paradigm has become the dominant method for multi-object tracking: it detects objects in each frame and then performs data association across frames. However, its sequential frame-wise matching fundamentally suffers from intermediate interruptions in a video, such as object occlusions, fast camera movements, and abrupt lighting changes. Moreover, it typically overlooks temporal information beyond the two frames being matched. In this paper, we investigate an alternative by treating object association as clip-wise matching. Our new perspective views a single long video sequence as multiple short clips, and tracking is then performed both within and between the clips. The benefits of this new approach are twofold. First, our method is robust to tracking error accumulation and propagation, as video chunking allows bypassing interrupted frames, and short-clip tracking avoids the conventional, error-prone long-term track memory management. Second, multi-frame information is aggregated during the clip-wise matching, resulting in more accurate long-range track association than conventional frame-wise matching. Given the state-of-the-art tracking-by-detection tracker, QDTrack, we showcase how the tracking performance improves with our new tracking formulation. We evaluate our proposal on two tracking benchmarks, TAO and MOT17, which have complementary characteristics and pose complementary challenges.
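The sketch below illustrates the clip-wise association idea described in the abstract; it is not the paper's implementation. It assumes per-frame detections already come with appearance embeddings (e.g., from a QDTrack-style embedding head) and that every frame contains at least one detection, and it uses simple cosine similarity with Hungarian matching both within each short clip and between clips, representing each tracklet by the mean of its embeddings. All function names and parameters (clip_len, thresh) are hypothetical.

```python
# Illustrative sketch only: clip-wise tracking with appearance embeddings.
# Assumes every frame has at least one detection (no empty-frame handling).
import numpy as np
from scipy.optimize import linear_sum_assignment


def cosine_sim(a, b):
    """Pairwise cosine similarity between two (N, D) embedding arrays."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T


def match(sim, thresh=0.5):
    """Hungarian matching on a similarity matrix, keeping pairs above `thresh`."""
    rows, cols = linear_sum_assignment(-sim)  # negate to maximize similarity
    return [(r, c) for r, c in zip(rows, cols) if sim[r, c] >= thresh]


def track_within_clip(clip_embs):
    """Frame-wise association inside one short clip.

    `clip_embs`: list of (N_t, D) arrays, one per frame.
    Returns tracklets, each a list of embeddings of the same object.
    """
    tracklets = [[e] for e in clip_embs[0]]
    for frame in clip_embs[1:]:
        last = np.stack([t[-1] for t in tracklets])
        pairs = match(cosine_sim(last, frame))
        matched = set()
        for ti, di in pairs:
            tracklets[ti].append(frame[di])
            matched.add(di)
        # Unmatched detections start new tracklets (objects entering the scene).
        tracklets += [[frame[di]] for di in range(len(frame)) if di not in matched]
    return tracklets


def associate_clips(video_embs, clip_len=5):
    """Chunk the sequence into short clips, track within each clip, then link
    clips by matching tracklet-level embeddings (mean over the clip), so that
    multi-frame information is aggregated for long-range association."""
    clips = [video_embs[i:i + clip_len] for i in range(0, len(video_embs), clip_len)]
    tracks = track_within_clip(clips[0])
    for clip in clips[1:]:
        new = track_within_clip(clip)
        old_repr = np.stack([np.mean(t, axis=0) for t in tracks])
        new_repr = np.stack([np.mean(t, axis=0) for t in new])
        pairs = match(cosine_sim(old_repr, new_repr))
        linked = set()
        for oi, ni in pairs:
            tracks[oi].extend(new[ni])
            linked.add(ni)
        # Tracklets with no match become new long-term tracks.
        tracks += [new[ni] for ni in range(len(new)) if ni not in linked]
    return tracks
```

In this sketch, robustness to interrupted frames would come from how the clips are formed and from matching whole tracklet representations rather than single frames; the paper's actual within-clip and between-clip matching builds on QDTrack's learned quasi-dense embeddings rather than this hand-crafted cosine/mean scheme.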
