Publication:

ScanDL 2.0: A Generative Model of Eye Movements in Reading Synthesizing Scanpaths and Fixation Durations

Date: 2025
Journal Article
Published version
cris.lastimport.scopus: 2025-07-10T03:45:26Z
cris.virtual.orcid: https://orcid.org/0000-0001-5776-7235
cris.virtualsource.orcid: ffbd9998-08a4-48c7-bb7d-fd63ba81ecdd
dc.contributor.institution: University of Zurich
dc.date.accessioned: 2025-07-09T11:28:27Z
dc.date.available: 2025-07-09T11:28:27Z
dc.date.issued: 2025-05-22
dc.description.abstract

Eye movements in reading have become a vital tool for investigating the cognitive mechanisms involved in language processing. They are used not only within psycholinguistics but have also been leveraged in NLP to improve the performance of language models on downstream tasks. However, the scarcity of real eye-tracking data and its limited generalizability at inference time present challenges for data-driven approaches. In response, synthetic scanpaths have emerged as a promising alternative. Despite these advances, however, existing machine-learning-based methods, including the state-of-the-art ScanDL [9], fail to incorporate fixation durations into the generated scanpaths, which are crucial for a complete representation of reading behavior. We therefore propose a novel model, denoted ScanDL 2.0, which synthesizes both fixation locations and durations. It sets a new benchmark in generating human-like synthetic scanpaths, demonstrating superior performance across various evaluation settings. Furthermore, psycholinguistic analyses confirm its ability to emulate key phenomena in human reading. Our code and pre-trained model weights are available at https://github.com/DiLi-Lab/ScanDL-2.0.

dc.identifier.doi: 10.1145/3725830
dc.identifier.issn: 2573-0142
dc.identifier.scopus: 2-s2.0-105006785271
dc.identifier.uri: https://www.zora.uzh.ch/handle/20.500.14742/231244
dc.language.iso: eng
dc.subject.ddc: 410 Linguistics
dc.subject.ddc: 000 Computer science, knowledge & systems
dc.subject.ddc: 400 Language
dc.title: ScanDL 2.0: A Generative Model of Eye Movements in Reading Synthesizing Scanpaths and Fixation Durations

dc.type: article
dcterms.accessRights: info:eu-repo/semantics/openAccess
dcterms.bibliographicCitation.journaltitle: Proceedings of the ACM on Human-Computer Interaction
dcterms.bibliographicCitation.number: 3
dcterms.bibliographicCitation.originalpublishername: ACM Digital Library
dcterms.bibliographicCitation.pageend: 29
dcterms.bibliographicCitation.pagestart: 1
dcterms.bibliographicCitation.volume: 9
dspace.entity.type: Publication
uzh.contributor.affiliation: University of Zurich
uzh.contributor.affiliation: University of Zurich
uzh.contributor.affiliation: University of Zurich
uzh.contributor.author: Bolliger, Lena S
uzh.contributor.author: Reich, David Robert
uzh.contributor.author: Jäger, Lena A
uzh.contributor.correspondence: Yes
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.document.availability: published_version
uzh.eprint.datestamp: 2025-07-09 11:28:27
uzh.eprint.lastmod: 2025-07-10 20:00:18
uzh.eprint.statusChange: 2025-07-09 11:28:27
uzh.harvester.eth: Yes
uzh.harvester.nb: No
uzh.identifier.doi: 10.5167/uzh-278293
uzh.jdb.eprintsId: 39585
uzh.oastatus.unpaywall: hybrid
uzh.oastatus.zora: Hybrid
uzh.publication.citation: Bolliger, L. S., Reich, D. R., & Jäger, L. A. (2025). ScanDL 2.0: A Generative Model of Eye Movements in Reading Synthesizing Scanpaths and Fixation Durations. Proceedings of the ACM on Human-Computer Interaction, 9(3), 1–29. https://doi.org/10.1145/3725830
uzh.publication.freeAccessAt: doi
uzh.publication.originalwork: original
uzh.publication.publishedStatus: final
uzh.scopus.impact: 0
uzh.scopus.subjects: Social Sciences (miscellaneous)
uzh.scopus.subjects: Human-Computer Interaction
uzh.scopus.subjects: Computer Networks and Communications
uzh.workflow.doaj: uzh.workflow.doaj.false
uzh.workflow.eprintid: 278293
uzh.workflow.fulltextStatus: public
uzh.workflow.revisions: 18
uzh.workflow.rightsCheck: keininfo
uzh.workflow.source: Crossref:10.1145/3725830
uzh.workflow.status: archive
Files

Original bundle

Name: 3725830.pdf
Size: 4.12 MB
Format: Adobe Portable Document Format