
Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned
Voita, Elena; Talbot, David; Moiseev, Fedor; Sennrich, Rico; Titov, Ivan (2019). Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned. In: Proceedings of the 57th Conference of the Association for Computational Linguistics, Florence, Italy, 28 July 2019 - 2 August 2019, 5797-5808.

Statistics

Downloads

4 downloads since deposited on 21 Aug 2019
4 downloads in the past 12 months

Additional indexing

Item Type: Conference or Workshop Item (Paper), original work
Communities & Collections: 06 Faculty of Arts > Institute of Computational Linguistics
Dewey Decimal Classification: 000 Computer science, knowledge & systems; 410 Linguistics
Language: English
Event End Date: 2 August 2019
Deposited On: 21 Aug 2019 07:47
Last Modified: 02 Nov 2019 13:37
Publisher: Association for Computational Linguistics
OA Status: Green
Official URL: https://www.aclweb.org/anthology/P19-1580

Download

Green Open Access

Download PDF: 'Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned'
Content: Accepted Version
Language: English
Filetype: PDF
Size: 3MB
Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)