Alleviating the Inequality of Attention Heads for Neural Machine Translation

Zewei Sun, Shujian Huang, Xinyu Dai, Jiajun Chen


Abstract
Recent studies show that the attention heads in the Transformer are not equal. We relate this phenomenon to the imbalanced training of multi-head attention and the model's dependence on specific heads. To tackle this problem, we propose a simple masking method, HeadMask, in two specific ways. Experiments show that translation improvements are achieved on multiple language pairs. Subsequent empirical analyses also support our assumption and confirm the effectiveness of the method.
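The page carries no code, so the following is only a rough sketch of the head-masking idea described in the abstract: during training, a subset of attention-head outputs is randomly zeroed out so the model cannot rely on a few dominant heads. The module name, the masked_heads parameter, and the batch-shared random masking choice are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class MaskedMultiHeadAttention(nn.Module):
    """Self-attention with random head masking during training (illustrative sketch)."""

    def __init__(self, d_model: int, num_heads: int, masked_heads: int = 1):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.masked_heads = masked_heads  # number of heads to drop per training step
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, t, _ = x.shape

        def split(z):
            # (batch, seq, d_model) -> (batch, heads, seq, d_head)
            return z.view(b, t, self.num_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))

        # Scaled dot-product attention per head
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        heads = attn @ v  # (batch, heads, seq, d_head)

        if self.training and self.masked_heads > 0:
            # Randomly choose heads to mask for this forward pass
            # (shared across the batch; one simple choice among several possible).
            keep = torch.ones(self.num_heads, device=x.device)
            drop = torch.randperm(self.num_heads)[: self.masked_heads]
            keep[drop] = 0.0
            heads = heads * keep.view(1, self.num_heads, 1, 1)

        out = heads.transpose(1, 2).contiguous().view(b, t, -1)
        return self.out_proj(out)

At inference time the mask is disabled (self.training is False), so all heads contribute as in a standard Transformer layer.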
Anthology ID:
2022.coling-1.466
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5246–5250
URL:
https://aclanthology.org/2022.coling-1.466
Cite (ACL):
Zewei Sun, Shujian Huang, Xinyu Dai, and Jiajun Chen. 2022. Alleviating the Inequality of Attention Heads for Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5246–5250, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Alleviating the Inequality of Attention Heads for Neural Machine Translation (Sun et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.466.pdf
Data
IWSLT2015, WMT 2016, WMT 2016 News