Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations

Matthias Lindemann, Alexander Koller, Ivan Titov


Abstract
Models need appropriate inductive biases to effectively learn from small amounts of data and generalize systematically outside of the training distribution. While Transformers are highly versatile and powerful, they can still benefit from enhanced structural inductive biases for seq2seq tasks, especially those involving syntactic transformations, such as converting active to passive voice or semantic parsing. In this paper, we propose to strengthen the structural inductive bias of a Transformer by intermediate pre-training to perform synthetically generated syntactic transformations of dependency trees given a description of the transformation. Our experiments confirm that this helps with few-shot learning of syntactic tasks such as chunking, and also improves structural generalization for semantic parsing. Our analysis shows that the intermediate pre-training leads to attention heads that keep track of which syntactic transformation needs to be applied to which token, and that the model can leverage these attention heads on downstream tasks.
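To make the pre-training setup concrete, the following minimal sketch constructs one synthetic example of the kind the abstract describes: a transformation description plus a dependency-parsed sentence on the input side, and the transformed token sequence as the target. Everything here is an illustrative assumption (the Token class, the "MOVE ... AFTER ..." description format, and the move_subtree_after function are hypothetical), not the authors' released code, which is linked under Software below.

```python
# Hypothetical sketch of one synthetic pre-training example in the spirit of
# the paper: input = transformation description + sentence, output = the
# sentence with a dependency subtree moved. Not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Token:
    idx: int        # position in the sentence
    form: str       # surface form
    head: int       # index of syntactic head (-1 for root)
    deprel: str     # dependency relation to the head

def subtree(tokens, root_idx):
    """Collect indices of root_idx and all of its (transitive) dependents."""
    out = {root_idx}
    changed = True
    while changed:
        changed = False
        for t in tokens:
            if t.head in out and t.idx not in out:
                out.add(t.idx)
                changed = True
    return sorted(out)

def move_subtree_after(tokens, deprel, anchor_deprel):
    """Toy transformation: move the subtree headed by the token bearing
    `deprel` to directly after the token bearing `anchor_deprel`."""
    src = next(t.idx for t in tokens if t.deprel == deprel)
    moved = subtree(tokens, src)
    rest = [t.idx for t in tokens if t.idx not in moved]
    anchor = next(t.idx for t in tokens if t.deprel == anchor_deprel)
    pos = rest.index(anchor) + 1
    order = rest[:pos] + moved + rest[pos:]
    return [tokens[i].form for i in order]

# "the cat chased a mouse" with a toy dependency analysis
sent = [
    Token(0, "the", 1, "det"),
    Token(1, "cat", 2, "nsubj"),
    Token(2, "chased", -1, "root"),
    Token(3, "a", 4, "det"),
    Token(4, "mouse", 2, "obj"),
]

description = "MOVE nsubj AFTER obj"   # hypothetical description format
source = " ".join(t.form for t in sent)
target = " ".join(move_subtree_after(sent, "nsubj", "obj"))
print(f"{description} ||| {source} -> {target}")
# MOVE nsubj AFTER obj ||| the cat chased a mouse -> chased a mouse the cat
```

A seq2seq model pre-trained on many such (description, sentence) pairs must learn to apply an arbitrary specified transformation to an arbitrary tree, which is the structural inductive bias the paper aims to instill.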
Anthology ID: 2024.emnlp-main.645
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 11558–11573
URL: https://aclanthology.org/2024.emnlp-main.645/
DOI: 10.18653/v1/2024.emnlp-main.645
Cite (ACL): Matthias Lindemann, Alexander Koller, and Ivan Titov. 2024. Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 11558–11573, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations (Lindemann et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.645.pdf
Software: 2024.emnlp-main.645.software.zip