
Negation typology and general representation models for cross-lingual zero-shot negation scope resolution in Russian, French, and Spanish


Shaitarova, Anastassia; Rinaldi, Fabio (2021). Negation typology and general representation models for cross-lingual zero-shot negation scope resolution in Russian, French, and Spanish. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop, Online, 6 June 2021 - 11 June 2021, ACL Anthology.

Abstract

Negation is a linguistic universal that poses difficulties for cognitive and computational processing. Despite many advances in text analytics, negation resolution remains an acute and continuously researched question in Natural Language Processing. Reliable negation parsing affects results in biomedical text mining, sentiment analysis, machine translation, and many other fields. The availability of multilingual pre-trained general representation models makes it possible to experiment with negation detection in languages that lack annotated data. In this work we test the performance of two state-of-the-art contextual representation models, Multilingual BERT and XLM-RoBERTa. We resolve negation scope by conducting zero-shot transfer between English, Spanish, French, and Russian. Our best result amounts to a token-level F1-score of 86.86% between Spanish and Russian. We correlate these results with a linguistic negation typology and lexical capacity of the models.
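As a rough illustration of the token-level F1 metric reported above, the following is a generic sketch, not the authors' evaluation code: each token is labelled 1 if it falls inside a negation scope and 0 otherwise, and precision, recall, and F1 are computed over those labels.

```python
# Illustrative sketch: token-level precision/recall/F1 for negation
# scope resolution, with binary per-token labels
# (1 = inside a negation scope, 0 = outside).

def token_level_f1(gold, pred):
    """Return (precision, recall, F1) over aligned token label sequences."""
    assert len(gold) == len(pred)
    tp = sum(1 for g, p in zip(gold, pred) if g == 1 and p == 1)
    fp = sum(1 for g, p in zip(gold, pred) if g == 0 and p == 1)
    fn = sum(1 for g, p in zip(gold, pred) if g == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical example: "He did not attend the meeting",
# gold scope over "attend the meeting", predicted scope misses "meeting".
gold = [0, 0, 0, 1, 1, 1]
pred = [0, 0, 0, 1, 1, 0]
p, r, f1 = token_level_f1(gold, pred)  # p = 1.0, r = 2/3, f1 = 0.8
```

The 86.86% figure in the abstract is an F1-score of this token-level kind, computed after zero-shot transfer from Spanish training data to Russian test data.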



Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 06 Faculty of Arts > Institute of Computational Linguistics
Dewey Decimal Classification: 000 Computer science, knowledge & systems; 410 Linguistics
Language: English
Event End Date: 11 June 2021
Deposited On: 25 Aug 2021 10:47
Last Modified: 25 Aug 2021 10:47
Publisher: ACL Anthology
OA Status: Hybrid
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.18653/v1/2021.naacl-srw.3
Official URL: https://aclanthology.org/2021.naacl-srw.3/

Download

Hybrid Open Access

Content: Published Version
Filetype: PDF
Size: 303kB
Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)