Publication: Character-Level Translation with Self-attention
Date: 1 July 2020
Citations
Gao, Y., Nikolov, N. I., Hu, Y., & Hahnloser, R. H. (2020, July 1). Character-Level Translation with Self-attention. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online. https://doi.org/10.18653/v1/2020.acl-main.145
Abstract
We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English using up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level.
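The variant described in the abstract augments each encoder block with convolutions over neighbouring characters. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the placement of the convolution (before self-attention), the depthwise kernel of size 5, and the hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn


class ConvCharEncoderBlock(nn.Module):
    """Transformer encoder block that first mixes in local character context
    via a depthwise 1D convolution, then applies standard self-attention.
    Hypothetical sketch; kernel size and fusion strategy are assumptions."""

    def __init__(self, d_model=512, n_heads=8, d_ff=2048, kernel_size=5, dropout=0.1):
        super().__init__()
        # Depthwise convolution over the character (time) axis; padding keeps
        # the sequence length unchanged (assumes an odd kernel size).
        self.local_conv = nn.Conv1d(
            d_model, d_model, kernel_size,
            padding=kernel_size // 2, groups=d_model,
        )
        self.self_attn = nn.MultiheadAttention(
            d_model, n_heads, dropout=dropout, batch_first=True
        )
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, seq_len, d_model) character embeddings.
        # 1) Combine information from nearby characters with a convolution.
        local = self.local_conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + self.dropout(local))
        # 2) Standard self-attention over the locally enriched sequence.
        attn_out, _ = self.self_attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm2(x + self.dropout(attn_out))
        # 3) Position-wise feed-forward sublayer.
        return self.norm3(x + self.dropout(self.ff(x)))


# Example usage on a batch of 2 sequences of 50 character embeddings:
# block = ConvCharEncoderBlock()
# out = block(torch.randn(2, 50, 512))   # -> (2, 50, 512)

Applying the depthwise convolution before attention lets each character representation already encode a small window of its neighbours, which is one simple way to realize "combining information from nearby characters"; the paper may fuse the convolutional and attention paths differently.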
Additional indexing
Creators (Authors): Gao, Y.; Nikolov, N. I.; Hu, Y.; Hahnloser, R. H.
Event Title: 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020)
Event Location: Online
Event Start Date
Event End Date
Item Type
Language: English
Date available
OA Status
Free Access at
Publisher DOI: https://doi.org/10.18653/v1/2020.acl-main.145