Publication:

On Biasing Transformer Attention Towards Monotonicity

Date

2021
Conference or Workshop Item
Published version

Citations

Rios, A., Amrhein, C., Aepli, N., & Sennrich, R. (2021). On Biasing Transformer Attention Towards Monotonicity. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 4474–4488. https://www.aclweb.org/anthology/2021.naacl-main.354

Abstract


Many sequence-to-sequence tasks in natural language processing are roughly monotonic in the alignment between source and target sequence, and previous work has facilitated or enforced learning of monotonic attention behavior via specialized attention functions or pretraining. In this work, we introduce a monotonicity loss function that is compatible with standard attention mechanisms and test it on several sequence-to-sequence tasks: grapheme-to-phoneme conversion, morphological inflection, transliteration, and dialect normalization.
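The abstract describes a monotonicity loss that works with standard attention rather than a specialized attention function. As a minimal illustration of the general idea (not the paper's actual formulation), one hypothetical way to penalize non-monotonic behavior is to compute, for each target step, the expected source position under the attention distribution, and penalize any decrease between consecutive steps:

```python
import numpy as np

def monotonicity_loss(attn):
    """Illustrative monotonicity penalty on an attention matrix.

    attn: array of shape (tgt_len, src_len); each row is the attention
    distribution over source positions for one target step.

    This is a hypothetical sketch, not the loss used in the paper: it
    penalizes decreases in the expected attended source position between
    consecutive target steps.
    """
    positions = np.arange(attn.shape[1], dtype=float)
    expected = attn @ positions                 # expected source index per target step
    # Positive part of backward jumps = monotonicity violations.
    backward = np.maximum(0.0, expected[:-1] - expected[1:])
    return float(backward.sum())

# Perfectly monotonic (diagonal) attention incurs no penalty:
print(monotonicity_loss(np.eye(3)))        # 0.0
# Fully reversed attention is penalized:
print(monotonicity_loss(np.eye(3)[::-1]))  # 2.0
```

Because such a penalty is a function of the attention weights alone, it can be added to the training objective of an unmodified Transformer, which is the key property the abstract highlights.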

Metrics

Downloads

5 since deposited on 2021-05-25
2 last week
Acq. date: 2025-11-14

Views

1 since deposited on 2021-05-25
Acq. date: 2025-11-14

Additional indexing

Creators (Authors)

Event Title

Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

Event Location

Online

Event Start Date

2021-06-06

Event End Date

2021-06-11

Page range/Item number

4474

Page end

4488

Item Type

Conference or Workshop Item

Dewey Decimal Classification

Language

English

Date available

2021-05-25

OA Status

Green

Free Access at

Official URL


Green Open Access

Files
Files available to download: 1