Enlivening Redundant Heads in Multi-head Self-attention for Machine Translation

Tianfu Zhang, Heyan Huang, Chong Feng, Longbing Cao


Abstract
Multi-head self-attention has recently attracted enormous interest owing to its specialized functions, highly parallelizable computation, and flexible extensibility. However, recent empirical studies show that some self-attention heads contribute little and can be pruned as redundant. This work takes the novel perspective of identifying and then vitalizing such redundant heads. We propose a redundant head enlivening (RHE) method that precisely identifies redundant heads and then realizes their potential by learning syntactic relations and prior knowledge in the text, without sacrificing the roles of important heads. Two novel syntax-enhanced attention (SEA) mechanisms, a dependency mask bias and a relative local-phrasal position bias, are introduced to revise self-attention distributions for syntactic enhancement in machine translation. The importance of individual heads is evaluated dynamically during redundant head identification, and SEA is then applied to vitalize the redundant heads while maintaining the strength of the important ones. Experimental results on the widely adopted WMT14 and WMT16 English-to-German and English-to-Czech machine translation tasks validate the effectiveness of RHE.
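To make the idea of a syntactic attention bias concrete, the sketch below shows one plausible (hypothetical) way a dependency mask bias could be added to scaled dot-product attention logits before the softmax. This is a minimal illustration under assumptions, not the paper's actual formulation: the function name `biased_attention`, the single scalar `bias_value`, and the binary `dep_mask` encoding are all illustrative choices.

```python
# Illustrative sketch only (not the paper's implementation): nudging attention
# toward dependency-linked tokens by adding a bias to the pre-softmax scores.
import torch
import torch.nn.functional as F

def biased_attention(q, k, v, dep_mask, bias_value=1.0):
    """q, k, v: [batch, heads, seq, d_k] query/key/value tensors.
    dep_mask: [batch, seq, seq] binary matrix, 1 where two tokens are
    connected in the dependency parse (hypothetical encoding)."""
    d_k = q.size(-1)
    # Standard scaled dot-product attention scores.
    scores = torch.matmul(q, k.transpose(-2, -1)) / d_k ** 0.5
    # Add a bias only at dependency-connected positions, shared across heads.
    scores = scores + bias_value * dep_mask.unsqueeze(1)
    attn = F.softmax(scores, dim=-1)
    return torch.matmul(attn, v)
```

In the spirit of RHE, such a bias would be applied only to heads identified as redundant, leaving the attention distributions of important heads untouched.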
Anthology ID:
2021.emnlp-main.260
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3238–3248
URL:
https://aclanthology.org/2021.emnlp-main.260
DOI:
10.18653/v1/2021.emnlp-main.260
Cite (ACL):
Tianfu Zhang, Heyan Huang, Chong Feng, and Longbing Cao. 2021. Enlivening Redundant Heads in Multi-head Self-attention for Machine Translation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3238–3248, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Enlivening Redundant Heads in Multi-head Self-attention for Machine Translation (Zhang et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.260.pdf
Video:
https://aclanthology.org/2021.emnlp-main.260.mp4