Character-Level Translation with Self-attention


Gao, Yingqiang; Nikolov, Nikola I.; Hu, Yuhuang; Hahnloser, Richard H. R. (2020). Character-Level Translation with Self-attention. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, 1 July 2020.

Abstract

We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English using up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level and converges faster while learning more robust character-level alignments.
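
To make the variant described in the abstract concrete, here is a minimal PyTorch sketch of an encoder block that first mixes information from nearby characters with a 1D convolution and then applies standard self-attention. This is not the authors' released implementation; the class name, hyperparameters (d_model, kernel_size, etc.), and the exact placement of residual connections and layer norms are illustrative assumptions.

```python
# Sketch only: a transformer encoder block whose input is first passed
# through a character-level 1D convolution, as the abstract describes.
# All hyperparameters below are assumptions, not the paper's settings.
import torch
import torch.nn as nn

class ConvSelfAttentionBlock(nn.Module):
    def __init__(self, d_model=512, n_heads=8, kernel_size=5, dropout=0.1):
        super().__init__()
        # 1D convolution over the character axis; padding of
        # kernel_size // 2 keeps the sequence length unchanged.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.ReLU(),
            nn.Linear(4 * d_model, d_model))

    def forward(self, x):
        # x: (batch, seq_len, d_model) character embeddings.
        # Conv1d expects (batch, channels, seq_len), so transpose around it.
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        h = self.norm1(x + c)                       # residual around the conv
        a, _ = self.attn(h, h, h, need_weights=False)
        h = self.norm2(h + a)                       # residual around attention
        return h + self.ff(h)                       # residual feed-forward
```

The intuition is that the convolution gives each character position a locally aggregated representation before global self-attention is applied, which is one plausible reading of "the encoder block combines information from nearby characters using convolutions".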


Additional indexing

Item Type: Conference or Workshop Item (Paper), refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 1 July 2020
Deposited On: 16 Feb 2021 09:08
Last Modified: 17 Feb 2021 02:52
Publisher: Association for Computational Linguistics
OA Status: Hybrid
Publisher DOI: https://doi.org/10.18653/v1/2020.acl-main.145

Download

Hybrid Open Access

Download PDF: 'Character-Level Translation with Self-attention'
Content: Published Version
Filetype: PDF
Size: 1MB