Event-Based Vision Meets Deep Learning on Steering Prediction for Self-Driving Cars


Maqueda, Ana I.; Loquercio, Antonio; Gallego, Guillermo; Garcia, Narciso; Scaramuzza, Davide (2018). Event-Based Vision Meets Deep Learning on Steering Prediction for Self-Driving Cars. In: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, 18 July 2018 - 23 July 2018, 5419-5427.

Abstract

Event cameras are bio-inspired vision sensors that naturally capture the dynamics of a scene, filtering out redundant information. This paper presents a deep neural network approach that unlocks the potential of event cameras on a challenging motion-estimation task: prediction of a vehicle's steering angle. To make the best out of this sensor–algorithm combination, we adapt state-of-the-art convolutional architectures to the output of event sensors and extensively evaluate the performance of our approach on a publicly available large-scale event-camera dataset (1000 km). We present qualitative and quantitative explanations of why event cameras allow robust steering prediction even in cases where traditional cameras fail, e.g. challenging illumination conditions and fast motion. Finally, we demonstrate the advantages of leveraging transfer learning from traditional to event-based vision, and show that our approach outperforms state-of-the-art algorithms based on standard cameras.



Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Scopus Subject Areas: Physical Sciences > Software; Physical Sciences > Computer Vision and Pattern Recognition
Language: English
Event End Date: 23 July 2018
Deposited On: 29 Oct 2019 14:37
Last Modified: 27 Nov 2020 07:32
Publisher: IEEE
ISBN: 978-1-5386-6420-9
OA Status: Green
Publisher DOI: https://doi.org/10.1109/cvpr.2018.00568
Official URL: http://rpg.ifi.uzh.ch/docs/CVPR18_Maqueda.pdf
Other Identification Number: merlin-id:18682

Download

Green Open Access: Accepted Version (PDF, 3 MB).