Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization


Bryner, Samuel; Gallego, Guillermo; Rebecq, Henri; Scaramuzza, Davide (2019). Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization. In: 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20 May 2019 - 24 May 2019, 325-331.

Abstract

Event cameras are novel bio-inspired vision sensors that output pixel-level intensity changes, called “events”, instead of traditional video images. These asynchronous sensors naturally respond to motion in the scene with very low latency (microseconds) and have a very high dynamic range. These features, along with very low power consumption, make event cameras an ideal sensor for fast robot localization and wearable applications, such as AR/VR and gaming. With these applications in mind, we present a method to track the 6-DOF pose of an event camera in a known environment, which we assume is described by a photometric 3D map (i.e., intensity plus depth information) built via classic dense 3D reconstruction algorithms. Our approach uses the raw events directly, without intermediate features, within a maximum-likelihood framework to estimate the camera motion that best explains the events via a generative model. We evaluate the method on both simulated and real data and show improved results over the state of the art. We release the datasets to the public to foster reproducibility and research on this topic.
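The abstract refers to a generative model of events. As a rough illustrative sketch only (not the paper's implementation; the function name and threshold value below are assumptions), the standard idealized event-camera model fires an event at a pixel whenever the log intensity there has changed by more than a contrast threshold C since that pixel's last event:

```python
import numpy as np

def simulate_events(log_I_prev, log_I_curr, last_event_level, C=0.2):
    """Idealized event-camera model (illustrative, not the paper's code).

    A pixel fires an event whenever its log intensity has changed by more
    than the contrast threshold C since the pixel's last event. Returns the
    firing pixels' (rows, cols), their polarities (+1 brighter, -1 darker),
    and the updated per-pixel reference levels.
    """
    diff = log_I_curr - last_event_level
    fired = np.abs(diff) >= C
    rows, cols = np.nonzero(fired)
    polarity = np.sign(diff[fired]).astype(int)
    # Reset the reference level only at pixels that fired.
    new_level = last_event_level.copy()
    new_level[fired] = log_I_curr[fired]
    return rows, cols, polarity, new_level
```

In a tracking framework such as the one described, a model like this predicts the events a candidate camera motion would produce from the photometric 3D map, and the pose is optimized to best match the observed events.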

Statistics

Downloads: 4 since deposited on 26 Jan 2021; 4 in the last 12 months

Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Scopus Subject Areas: Physical Sciences > Software
Physical Sciences > Control and Systems Engineering
Physical Sciences > Artificial Intelligence
Physical Sciences > Electrical and Electronic Engineering
Language: English
Event End Date: 24 May 2019
Deposited On: 26 Jan 2021 10:48
Last Modified: 27 Jan 2021 21:02
Publisher: IEEE
ISBN: 978-1-5386-6027-0
OA Status: Green
Publisher DOI: https://doi.org/10.1109/icra.2019.8794255
Other Identification Number: merlin-id:20285

Download

Green Open Access

Accepted Version (PDF, 2MB): 'Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization'