
Event-based, 6-DOF pose tracking for high-speed maneuvers


Müggler, Elias; Huber, Basil; Scaramuzza, Davide (2014). Event-based, 6-DOF pose tracking for high-speed maneuvers. In: IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), Chicago, IL, USA, 14 September 2014 - 18 September 2014, 2761-2768.

Abstract

In the last few years, we have witnessed impressive demonstrations of aggressive flights and acrobatics using quadrotors. However, those robots are actually blind. They do not see by themselves, but through the “eyes” of an external motion capture system. Flight maneuvers using onboard sensors are still slow compared to those attainable with motion capture systems. At the current state, the agility of a robot is limited by the latency of its perception pipeline. To obtain more agile robots, we need to use faster sensors. In this paper, we present the first onboard perception system for 6-DOF localization during high-speed maneuvers using a Dynamic Vision Sensor (DVS). Unlike a standard CMOS camera, a DVS does not wastefully send full image frames at a fixed frame rate. Instead, similar to the human eye, it only transmits pixel-level brightness changes at the time they occur with microsecond resolution, thus offering the possibility to create a perception pipeline whose latency is negligible compared to the dynamics of the robot. We exploit these characteristics to estimate the pose of a quadrotor with respect to a known pattern during high-speed maneuvers, such as flips, with rotational speeds up to 1,200°/s. Additionally, we provide a versatile method to capture ground-truth data using a DVS.
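The abstract's key idea is that a DVS emits an asynchronous stream of per-pixel brightness-change events rather than full frames. The following minimal sketch illustrates that data model only; the `Event` fields and the `accumulate` helper are illustrative assumptions, not the paper's actual algorithm or any real DVS driver API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t_us: int      # timestamp in microseconds (DVS events are microsecond-resolved)
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def accumulate(events, width, height):
    """Sum event polarities into an image-like grid.

    Illustrative only: real event-based pipelines (like the one in the
    paper) process each event as it arrives instead of batching, which
    is what keeps the perception latency negligible.
    """
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

# Three events on a 2x2 sensor: two ON events at (1,0), one OFF at (0,1).
events = [Event(1, 0, 10, +1), Event(1, 0, 25, +1), Event(0, 1, 40, -1)]
frame = accumulate(events, width=2, height=2)
# frame is [[0, 2], [-1, 0]]
```

Note how each event carries its own timestamp: there is no fixed frame rate, so a tracker can update its pose estimate the moment an event arrives.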

Statistics

Citations

8 citations in Web of Science®
18 citations in Scopus®

Downloads

11 downloads since deposited on 12 Aug 2016
10 downloads since 12 months

Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Language: English
Event End Date: 18 September 2014
Deposited On: 12 Aug 2016 08:51
Last Modified: 30 Jan 2017 08:34
Publisher: Institute of Electrical and Electronics Engineers
Series Name: IEEE International Conference on Intelligent Robots and Systems. Proceedings
ISSN: 2153-0858
Additional Information: © 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Publisher DOI: https://doi.org/10.1109/IROS.2014.6942940
Related URLs: http://www.iros2014.org/ (Organisation)
Other Identification Number: merlin-id:10197

Download

Content: Accepted Version
Filetype: PDF
Size: 837kB
