Paper Title
Sim-to-Real Transfer of Robotic Assembly with Visual Inputs Using CycleGAN and Force Control
Paper Authors
Paper Abstract
Recently, deep reinforcement learning (RL) has shown impressive successes in robotic manipulation applications. However, training robots in the real world is nontrivial owing to sample-efficiency and safety concerns. Sim-to-real transfer has been proposed to address these concerns, but it introduces a new issue known as the reality gap. In this work, we introduce a sim-to-real learning framework for vision-based assembly tasks that is trained in a simulated environment using inputs from a single camera. We present a domain adaptation method based on cycle-consistent generative adversarial networks (CycleGAN) and a force control transfer approach to bridge the reality gap. We demonstrate that the proposed framework, trained entirely in simulation, can be successfully transferred to a real peg-in-hole setup.
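The CycleGAN domain adaptation mentioned in the abstract rests on a cycle-consistency objective: a generator G maps simulated images toward the real domain, a second generator F maps real images back toward simulation, and both round trips should reconstruct the original image. The following is a minimal illustrative sketch of that loss, not the paper's implementation; the toy generators, array shapes, and function names are assumptions for demonstration only.

```python
import numpy as np

def cycle_consistency_loss(G, F, x_sim, y_real):
    """L1 cycle-consistency loss from CycleGAN:
    F(G(x)) should recover x (sim -> real -> sim), and
    G(F(y)) should recover y (real -> sim -> real)."""
    forward = np.mean(np.abs(F(G(x_sim)) - x_sim))
    backward = np.mean(np.abs(G(F(y_real)) - y_real))
    return forward + backward

# Toy stand-in "generators": a brightness shift and its exact inverse.
# In practice G and F are convolutional networks trained adversarially.
G = lambda img: img + 0.1  # hypothetical sim-to-real mapping
F = lambda img: img - 0.1  # hypothetical real-to-sim mapping

x_sim = np.random.rand(4, 64, 64, 3)   # batch of simulated camera frames
y_real = np.random.rand(4, 64, 64, 3)  # batch of real camera frames

loss = cycle_consistency_loss(G, F, x_sim, y_real)
print(loss)
```

Because the toy generators are exact inverses, the cycle loss here is (numerically) zero; with learned networks it is minimized jointly with the adversarial losses so that translated images remain faithful to their source content.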