Title
Sim2Real Instance-Level Style Transfer for 6D Pose Estimation
Authors
Abstract
In recent years, synthetic data has been widely used to train 6D pose estimation networks, in part because it automatically provides perfect annotations at low cost. However, non-trivial domain gaps, such as differences in textures and materials, still exist between synthetic and real data, and these gaps have a measurable impact on performance. To address this problem, we introduce simulation-to-reality (sim2real) instance-level style transfer for 6D pose estimation network training. Our approach transfers the style of each target object individually, from synthetic to real, without human intervention, improving the quality of the synthetic data used to train pose estimation networks. We also propose a complete pipeline, from data collection to pose estimation network training, and conduct an extensive evaluation on a real-world robotic platform. Our evaluation shows that our method achieves significant improvements in both pose estimation performance and the realism of the style-adapted images.
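The abstract's key idea is that style is transferred per object instance rather than over the whole image. As a minimal illustration of what "instance-level" means, the sketch below restricts a simple style-matching step to each object's segmentation mask and composites the result back into the synthetic frame. The actual paper uses a learned style-transfer network; here a per-channel mean/std color-statistics match (Reinhard-style) stands in for it, and all function names and array shapes are assumptions for illustration only.

```python
import numpy as np

def match_stats(region, target):
    # Per-channel mean/std matching (Reinhard-style color transfer):
    # a minimal stand-in for the paper's learned style-transfer network.
    out = (region - region.mean(axis=0)) / (region.std(axis=0) + 1e-8)
    return out * target.std(axis=0) + target.mean(axis=0)

def instance_style_transfer(synth, masks, real_refs):
    """Stylize each object instance independently using its mask.

    synth:     (H, W, 3) uint8 synthetic image
    masks:     list of (H, W) boolean instance masks
    real_refs: list of (h, w, 3) uint8 real-image crops, one per instance
    """
    out = synth.astype(np.float64).copy()
    for mask, ref in zip(masks, real_refs):
        pix = out[mask]                               # (N, 3) instance pixels
        ref_pix = ref.reshape(-1, 3).astype(np.float64)
        out[mask] = match_stats(pix, ref_pix)         # restyle only this object
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because each instance is processed with its own real-image reference, two objects with different materials receive different styles, which whole-image transfer cannot do; the background is left untouched.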