Title

A Framework for Event-based Computer Vision on a Mobile Device

Authors

Gregor Lenz, Serge Picaud, Sio-Hoi Ieng

Abstract


We present the first publicly available Android framework to stream data from an event camera directly to a mobile phone. Today's mobile devices handle a wider range of workloads than ever before, and they incorporate a growing gamut of sensors that make devices smarter, more user-friendly and more secure. Conventional cameras in particular play a central role in such tasks, but they cannot record continuously, as the amount of redundant information recorded is costly to process. Bio-inspired event cameras, on the other hand, only record changes in a visual scene and have shown promising low-power applications that specifically suit mobile tasks such as face detection, gesture recognition or gaze tracking. Our prototype device is the first step towards embedding such an event camera into a battery-powered handheld device. The mobile framework allows us to stream events in real time and opens up possibilities for always-on and on-demand sensing on mobile phones. To reconcile the asynchronous event camera output with synchronous von Neumann hardware, we examine how buffering events and processing them in batches can benefit mobile applications. We evaluate our framework in terms of latency and throughput and show examples of computer vision tasks that involve both event-by-event and pre-trained neural network methods for gesture recognition, aperture-robust optical flow and grey-level image reconstruction from events. The code is available at https://github.com/neuromorphic-paris/frog
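The abstract's central systems idea is buffering the asynchronous event stream into fixed-duration batches so that synchronous hardware can process them. The framework's actual API is not reproduced here; the following is a minimal Python sketch of that batching idea, assuming each event carries a microsecond timestamp, pixel coordinates, and a polarity (the `Event` type and `batch_by_duration` helper are illustrative, not part of the released code).

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List

@dataclass
class Event:
    t: int  # timestamp in microseconds
    x: int  # pixel column
    y: int  # pixel row
    p: int  # polarity: +1 (brightness increase) or -1 (decrease)

def batch_by_duration(events: Iterable[Event], window_us: int) -> Iterator[List[Event]]:
    """Group a time-ordered event stream into fixed-duration batches.

    The first window is anchored at the first event's timestamp; each
    batch covers [t0, t0 + window_us). Empty windows yield no batch.
    """
    batch: List[Event] = []
    window_end = None
    for ev in events:
        if window_end is None:
            window_end = ev.t + window_us
        # Advance past any windows that closed before this event arrived.
        while ev.t >= window_end:
            if batch:
                yield batch
                batch = []
            window_end += window_us
        batch.append(ev)
    if batch:
        yield batch
```

For example, events at t = 0, 5, 12 and 25 µs with a 10 µs window fall into three batches: {0, 5}, {12} and {25}. The window length is the latency/throughput knob the paper evaluates: shorter windows reduce latency per batch, longer ones amortise per-batch overhead.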
