ZORA (Zurich Open Repository and Archive)

Character-Level Translation with Self-attention

Gao, Yingqiang; Nikolov, Nikola I; Hu, Yuhuang; Hahnloser, Richard HR (2020). Character-Level Translation with Self-attention. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, 1 July 2020, Association for Computational Linguistics.

Abstract

We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English using up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level and converges faster while learning more robust character-level alignments.
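
The abstract describes an encoder block that merges self-attention with convolutions over nearby characters. As a minimal sketch of what such a layer could look like in PyTorch: the kernel size, model width, and the way the two branches are combined are illustrative assumptions only, not the authors' actual implementation (see the Publisher DOI below for the published details).

# Minimal sketch (NOT the authors' implementation) of a transformer encoder
# layer whose self-attention output is combined with a 1D convolution over
# neighbouring character positions. All hyperparameters are assumptions.
import torch
import torch.nn as nn

class ConvAugmentedEncoderLayer(nn.Module):
    def __init__(self, d_model=256, n_heads=4, kernel_size=5, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads,
                                               dropout=dropout,
                                               batch_first=True)
        # Width-preserving 1D convolution that mixes nearby characters.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.ReLU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, seq_len, d_model) character embeddings.
        attn_out, _ = self.self_attn(x, x, x,
                                     key_padding_mask=key_padding_mask)
        # Convolve along the sequence dimension (channels-first for Conv1d).
        conv_out = self.conv(x.transpose(1, 2)).transpose(1, 2)
        # Merge both branches with a residual connection; how the paper
        # actually combines them is an assumption here.
        x = self.norm1(x + self.dropout(attn_out + conv_out))
        x = self.norm2(x + self.dropout(self.ffn(x)))
        return x

if __name__ == "__main__":
    layer = ConvAugmentedEncoderLayer()
    chars = torch.randn(2, 50, 256)   # 2 sequences of 50 character embeddings
    print(layer(chars).shape)         # torch.Size([2, 50, 256])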

Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Special Collections > Centers of Competence > Center for the Interdisciplinary Study of Language Evolution
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 1 July 2020
Deposited On: 16 Feb 2021 09:08
Last Modified: 12 May 2022 15:10
Publisher: Association for Computational Linguistics
OA Status: Hybrid
Publisher DOI: https://doi.org/10.18653/v1/2020.acl-main.145

Statistics

Citations

12 citations in Web of Science®
20 citations in Scopus®

Downloads

24 downloads since deposited on 16 Feb 2021
1 download in the last 12 months
