Paper Title
Extending Cobot's Motion Intention Visualization by Haptic Feedback
Paper Authors
Paper Abstract
Nowadays, robots are found in a growing number of areas where they collaborate closely with humans. Enabled by lightweight materials and safety sensors, these cobots are gaining increasing popularity in domestic care, supporting people with physical impairments in their everyday lives. However, when cobots perform actions autonomously, it remains challenging for human collaborators to understand and predict their behavior, which is crucial for achieving trust and user acceptance. One significant aspect of predicting cobot behavior is understanding their motion intention and comprehending how they "think" about their actions. Moreover, other information sources often occupy human visual and audio modalities, rendering them frequently unsuitable for transmitting such information. We work on a solution that communicates cobot intention via haptic feedback to tackle this challenge. In our concept, we map planned motions of the cobot to different haptic patterns to extend the visual intention feedback.
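The core idea of mapping planned cobot motions to distinct haptic patterns could be sketched as a simple lookup from motion primitives to vibrotactile cues. The following is a minimal illustrative sketch, not the paper's implementation; all motion names, pattern parameters, and the `HapticPattern` type are hypothetical assumptions.

```python
# Illustrative sketch (hypothetical, not from the paper): mapping planned
# cobot motion primitives to simple vibrotactile cue parameters.

from dataclasses import dataclass


@dataclass
class HapticPattern:
    """A vibrotactile cue: pulse frequency (Hz), amplitude (0-1), pulse count."""
    frequency_hz: float
    amplitude: float
    pulses: int


# Hypothetical mapping: e.g. denser, stronger pulses for motions that
# approach the user, so the cue's urgency reflects the planned motion.
MOTION_TO_PATTERN = {
    "approach_user": HapticPattern(frequency_hz=250.0, amplitude=0.9, pulses=3),
    "retreat":       HapticPattern(frequency_hz=150.0, amplitude=0.5, pulses=2),
    "lateral_move":  HapticPattern(frequency_hz=200.0, amplitude=0.7, pulses=1),
    "grasp":         HapticPattern(frequency_hz=300.0, amplitude=1.0, pulses=4),
}


def pattern_for_motion(motion: str) -> HapticPattern:
    """Look up the haptic cue for a planned motion; fall back to a gentle pulse."""
    return MOTION_TO_PATTERN.get(motion, HapticPattern(100.0, 0.3, 1))


print(pattern_for_motion("approach_user"))
print(pattern_for_motion("unknown_motion"))
```

In such a scheme, the haptic channel carries the intention cue while the visual and audio channels remain free for other tasks, which is the motivation the abstract describes.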