
Low-latency visual odometry using event-based feature tracks


Kueng, Beat; Müggler, Elias; Gallego Bonet, Guillermo; Scaramuzza, Davide (2016). Low-latency visual odometry using event-based feature tracks. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9 October 2016 - 14 October 2016, 1-8.

Abstract

New vision sensors, such as the Dynamic and Active-pixel Vision sensor (DAVIS), incorporate a conventional camera and an event-based sensor in the same pixel array. These sensors have great potential for robotics because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and high dynamic range. However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called “events”) and synchronous grayscale frames. In this paper, we present a low-latency visual odometry algorithm for the DAVIS sensor using event-based feature tracks. Features are first detected in the grayscale frames and then tracked asynchronously using the stream of events. The features are then fed to an event-based visual odometry algorithm that tightly interleaves robust pose optimization and probabilistic mapping. We show that our method successfully tracks the 6-DOF motion of the sensor in natural scenes. This is the first work on event-based visual odometry with the DAVIS sensor using feature tracks.
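To make the hybrid DAVIS output described above concrete, the following is a minimal, hypothetical sketch (not the paper's implementation): events are asynchronous tuples of timestamp, pixel location, and polarity, and a feature detected in a grayscale frame is updated from nearby events as they arrive. The class and parameter names (`Event`, `FeatureTrack`, `radius`, `alpha`) are illustrative assumptions only; the paper's actual tracker registers events against feature templates rather than using this simple nudge rule.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    t: float       # timestamp in seconds (microsecond resolution on the DAVIS)
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 brightness decrease

@dataclass
class FeatureTrack:
    x: float                # current sub-pixel feature position
    y: float
    radius: float = 3.0     # events farther than this are ignored
    history: list = field(default_factory=list)

    def try_update(self, ev: Event, alpha: float = 0.1) -> bool:
        """Asynchronously update the track from one event, if it is nearby."""
        if abs(ev.x - self.x) <= self.radius and abs(ev.y - self.y) <= self.radius:
            # Nudge the position toward the event (placeholder for a real
            # event-to-template registration step).
            self.x += alpha * (ev.x - self.x)
            self.y += alpha * (ev.y - self.y)
            self.history.append((ev.t, self.x, self.y))
            return True
        return False

# Usage: a feature detected in a grayscale frame at (10, 10) is then
# updated event by event; the distant event at (50, 50) is rejected.
track = FeatureTrack(x=10.0, y=10.0)
events = [Event(0.001, 11, 10, +1), Event(0.002, 50, 50, -1), Event(0.003, 11, 11, +1)]
accepted = sum(track.try_update(e) for e in events)
# accepted == 2: only the two events near the feature update the track
```

The key property this sketch illustrates is the asynchrony: the track advances per event, with microsecond-scale timestamps, instead of waiting for the next frame.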

Statistics

Citations

8 citations in Scopus®
10 citations in Microsoft Academic

Downloads

34 downloads since deposited on 12 Aug 2016
18 downloads since 12 months

Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Language: English
Event End Date: 14 October 2016
Deposited On: 12 Aug 2016 08:51
Last Modified: 02 Feb 2018 10:15
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
OA Status: Green
Free access at: Official URL. An embargo period may apply.
Official URL: http://rpg.ifi.uzh.ch/docs/IROS16_Kueng.pdf
Related URLs: http://www.iros2016.org/ (Organisation)
http://rpg.ifi.uzh.ch/research_dvs.html (Author)
Other Identification Number: merlin-id:13507

Download

Download PDF  'Low-latency visual odometry using event-based feature tracks'.
Content: Accepted Version
Filetype: PDF
Size: 1MB