Inferring user tasks in pedestrian navigation from eye movement data in real-world environments


Liao, Hua; Dong, Weihua; Huang, Haosheng; Gartner, Georg; Liu, Huiping (2019). Inferring user tasks in pedestrian navigation from eye movement data in real-world environments. International Journal of Geographical Information Science, 33(4):739-763.

Abstract

Eye movement data convey a wealth of information that can be used to probe human behaviour and cognitive processes. To date, eye tracking studies have mainly focused on laboratory-based evaluations of cartographic interfaces; in contrast, little attention has been paid to eye movement data mining for real-world applications. In this study, we propose using machine-learning methods to infer user tasks from eye movement data in real-world pedestrian navigation scenarios. We conducted a real-world pedestrian navigation experiment in which we recorded eye movement data from 38 participants. We trained and cross-validated a random forest classifier for classifying five common navigation tasks using five types of eye movement features. The results show that the classifier can achieve an overall accuracy of 67%. We found that statistical eye movement features and saccade encoding features are more useful than the other investigated types of features for distinguishing user tasks. We also identified that the choice of classifier, the time window size and the eye movement features considered are all important factors that influence task inference performance. Results of the research open doors to some potential real-world innovative applications, such as navigation systems that can provide task-related information depending on the task a user is performing.
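The pipeline described in the abstract (feature vectors per time window, a random forest classifier, cross-validation over task labels) can be sketched as follows. This is not the authors' code; the feature dimensions, window count, and parameter values are illustrative placeholders, and the data here is synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the real data: 200 time windows, each
# described by 10 eye-movement features (e.g. fixation-duration and
# saccade-amplitude statistics), labelled with one of five
# navigation tasks.
X = rng.normal(size=(200, 10))
y = rng.integers(0, 5, size=200)

# Random forest classifier, evaluated with 5-fold cross-validation,
# mirroring the train-and-cross-validate setup in the paper.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
mean_accuracy = scores.mean()
print(f"mean cross-validated accuracy: {mean_accuracy:.2f}")
```

With random labels the accuracy hovers near chance (0.2 for five classes); on real eye-movement features the paper reports an overall accuracy of 67%.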



Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Geography
Dewey Decimal Classification: 910 Geography & travel
Uncontrolled Keywords: Geography, Planning and Development, Library and Information Sciences, Information Systems
Language: English
Date: 3 April 2019
Deposited On: 18 Dec 2018 16:20
Last Modified: 31 Jan 2019 02:05
Publisher: Taylor & Francis
ISSN: 1365-8816
OA Status: Closed
Publisher DOI: https://doi.org/10.1080/13658816.2018.1482554
