A machine learning approach to visual perception of forest trails for mobile robots


Giusti, Alessandro; Guzzi, Jerome; Cireşan, Dan C; He, Fang-Lin; Rodriguez, Juan Pablo; Fontana, Flavio; Fässler, Matthias; Forster, Christian; Schmidhuber, Jürgen; Di Caro, Gianni; Scaramuzza, Davide; Gambardella, Luca M (2016). A machine learning approach to visual perception of forest trails for mobile robots. IEEE Robotics and Automation Letters, 1(2):661-667.

Abstract

We study the problem of perceiving forest or mountain trails from a single monocular image acquired from the viewpoint of a robot traveling on the trail itself. Previous literature focused on trail segmentation, and used low-level features such as image saliency or appearance contrast; we propose a different approach based on a deep neural network used as a supervised image classifier. By operating on the whole image at once, our system outputs the main direction of the trail compared to the viewing direction. Qualitative and quantitative results computed on a large real-world dataset (which we provide for download) show that our approach outperforms alternatives, and yields an accuracy comparable to the accuracy of humans tested on the same image classification task. Preliminary results on using this information for quadrotor control in unseen trails are reported. To the best of our knowledge, this is the first letter that describes an approach to perceive forest trails, which is demonstrated on a quadrotor micro aerial vehicle.
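
The paper itself ships no code in this record, but the idea summarized above (a deep network that classifies a whole monocular frame into one of three trail directions: turn left, go straight, turn right) can be illustrated with a minimal sketch. The PyTorch model below is an assumption for illustration only; the input resolution, layer sizes, activations, and class encoding are placeholders and do not reproduce the authors' exact architecture or training setup.

```python
# Illustrative sketch only: a small convolutional classifier that maps a single
# monocular RGB image to one of three trail-direction classes, in the spirit of
# the approach described in the abstract. Architecture details are assumptions.
import torch
import torch.nn as nn


class TrailDirectionNet(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        # Three conv/pool stages followed by a small fully connected head.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=4), nn.Tanh(), nn.MaxPool2d(2),
            nn.Conv2d(32, 32, kernel_size=4), nn.Tanh(), nn.MaxPool2d(2),
            nn.Conv2d(32, 32, kernel_size=4), nn.Tanh(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(200), nn.Tanh(),   # hidden size is an arbitrary choice
            nn.Linear(200, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of images, shape (N, 3, H, W); returns per-class scores (N, 3)
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = TrailDirectionNet()
    frame = torch.randn(1, 3, 101, 101)          # one RGB frame; size is assumed
    probs = torch.softmax(model(frame), dim=1)   # probabilities over 3 directions
    # Assumed encoding: 0 = turn left, 1 = go straight, 2 = turn right.
    print(probs.argmax(dim=1))
```

In a control loop, the predicted class (or the class probabilities) would steer the robot's yaw so that the camera re-aligns with the trail direction; how that mapping is done is outside the scope of this sketch.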

Statistics

Citations

469 citations in Scopus®

Downloads

2140 downloads since deposited on 12 Aug 2016
418 downloads since 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Scopus Subject Areas: Physical Sciences > Control and Systems Engineering
Physical Sciences > Biomedical Engineering
Physical Sciences > Human-Computer Interaction
Physical Sciences > Mechanical Engineering
Physical Sciences > Computer Vision and Pattern Recognition
Physical Sciences > Computer Science Applications
Physical Sciences > Control and Optimization
Physical Sciences > Artificial Intelligence
Uncontrolled Keywords: autonomous aerial vehicles, helicopters, image classification, learning (artificial intelligence), microrobots, neural nets, robot vision, deep-neural network, forest trails, machine learning approach, mobile robots, monocular image, quadrotor microaerial vehicle control, qualitative analysis, quantitative analysis, supervised image classifier, viewing direction, visual perception, Cameras, Image segmentation, Mobile robots, Roads, Robot vision systems, Visual perception, Aerial Robotics, Deep Learning, Machine Learning, Visual-Based Navigation
Language: English
Date: 17 December 2016
Deposited On: 12 Aug 2016 06:00
Last Modified: 26 Jan 2022 09:49
Publisher: Institute of Electrical and Electronics Engineers
ISSN: 2377-3766
Additional Information: © 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
OA Status: Green
Publisher DOI: https://doi.org/10.1109/LRA.2015.2509024
Related URLs: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7358076&isnumber=7419970 (Publisher)
Other Identification Number: merlin-id:12929
Project Information:
  • Funder: SNSF
  • Grant ID: 200020_140399
  • Project Title: Supervised Deep / Recurrent Nets
  • Content: Accepted Version