
Inferring user tasks in pedestrian navigation from eye movement data in real-world environments


Liao, Hua; Dong, Weihua; Huang, Haosheng; Gartner, Georg; Liu, Huiping (2019). Inferring user tasks in pedestrian navigation from eye movement data in real-world environments. International Journal of Geographical Information Science, 33(4):739-763.

Abstract

Eye movement data convey a wealth of information that can be used to probe human behaviour and cognitive processes. To date, eye tracking studies have mainly focused on laboratory-based evaluations of cartographic interfaces; in contrast, little attention has been paid to eye movement data mining for real-world applications. In this study, we propose using machine-learning methods to infer user tasks from eye movement data in real-world pedestrian navigation scenarios. We conducted a real-world pedestrian navigation experiment in which we recorded eye movement data from 38 participants. We trained and cross-validated a random forest classifier for classifying five common navigation tasks using five types of eye movement features. The results show that the classifier can achieve an overall accuracy of 67%. We found that statistical eye movement features and saccade encoding features are more useful than the other investigated types of features for distinguishing user tasks. We also identified that the choice of classifier, the time window size and the eye movement features considered are all important factors that influence task inference performance. The results of this research open the door to innovative real-world applications, such as navigation systems that can provide task-related information depending on the task a user is performing.
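
The sketch below illustrates the kind of pipeline the abstract describes: cross-validating a random forest classifier on per-time-window eye movement feature vectors. It is not the authors' code; the feature dimensionality, the synthetic data, and the use of scikit-learn are assumptions made for illustration only.

    # Hypothetical sketch (not the authors' implementation): cross-validating a
    # random forest on eye-movement feature vectors, one row per time window.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder data: 1000 time windows, 20 eye movement features each
    # (e.g. fixation/saccade statistics); labels are the 5 navigation tasks (0-4).
    X = rng.normal(size=(1000, 20))
    y = rng.integers(0, 5, size=1000)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

In the paper, the feature vectors are real eye movement features and the reported overall accuracy is 67%; with the random data above, accuracy will sit near chance (about 20% for five classes).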

Statistics

Citations

30 citations in Web of Science®
43 citations in Scopus®

Downloads

2 downloads since deposited on 18 Dec 2018
0 downloads in the last 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Geography
Dewey Decimal Classification: 910 Geography & travel
Scopus Subject Areas: Physical Sciences > Information Systems; Social Sciences & Humanities > Geography, Planning and Development; Social Sciences & Humanities > Library and Information Sciences
Uncontrolled Keywords: Geography, Planning and Development, Library and Information Sciences, Information Systems
Language: English
Date: 3 April 2019
Deposited On: 18 Dec 2018 16:20
Last Modified: 26 Jan 2022 19:16
Publisher: Taylor & Francis
ISSN: 1365-8816
OA Status: Closed
Publisher DOI: https://doi.org/10.1080/13658816.2018.1482554