Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision

Ryo Yoshida, Taiga Someya, Yohei Oseki


Abstract
Syntactic Language Models (SLMs) can be trained efficiently to achieve relatively high performance; however, their inference is inefficient because they explicitly generate syntactic structures. In this paper, we propose a new method dubbed tree-planting: instead of explicitly generating syntactic structures, we “plant” trees into the attention weights of unidirectional Transformer LMs so that they implicitly reflect the syntactic structures of natural language. Unidirectional Transformer LMs trained with tree-planting are called Tree-Planted Transformers (TPTs); they inherit the training efficiency of SLMs without changing the inference efficiency of their underlying Transformer LMs. Targeted syntactic evaluations on the SyntaxGym benchmark demonstrated that TPTs, despite not explicitly generating syntactic structures, significantly outperformed not only vanilla Transformer LMs but also various SLMs that generate hundreds of syntactic structures in parallel. This result suggests that TPTs can learn human-like syntactic knowledge as data-efficiently as SLMs while leaving the modeling space of Transformer LMs unchanged.
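
To make the tree-planting idea more concrete, below is a minimal, hypothetical PyTorch sketch of how implicit syntactic supervision could be added to ordinary next-token training: an auxiliary loss nudges one head's causal attention distribution toward a target distribution derived offline from parsed syntax trees, so no syntactic structure is generated at inference time. The function names, the KL-divergence formulation, and the weighting term lam are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn.functional as F

def tree_planting_loss(attn_weights: torch.Tensor,
                       tree_targets: torch.Tensor) -> torch.Tensor:
    # attn_weights: (batch, seq_len, seq_len) causal attention probabilities
    #               of the supervised head.
    # tree_targets: (batch, seq_len, seq_len) supervision distribution over
    #               preceding tokens, precomputed from syntax trees (assumed).
    # KL divergence between the tree-derived target and the head's attention.
    eps = 1e-9
    return F.kl_div((attn_weights + eps).log(), tree_targets,
                    reduction="batchmean")

def total_loss(lm_logits, labels, attn_weights, tree_targets, lam=1.0):
    # Standard next-token prediction loss plus the implicit syntactic
    # supervision term; the weight lam is illustrative, not from the paper.
    lm = F.cross_entropy(lm_logits.view(-1, lm_logits.size(-1)),
                         labels.view(-1))
    return lm + lam * tree_planting_loss(attn_weights, tree_targets)

At inference time the auxiliary term is simply dropped, so the model runs exactly like its underlying unidirectional Transformer LM.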
Anthology ID: 2024.findings-acl.303
Volume: Findings of the Association for Computational Linguistics ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand and virtual meeting
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 5120–5134
URL: https://aclanthology.org/2024.findings-acl.303
Cite (ACL): Ryo Yoshida, Taiga Someya, and Yohei Oseki. 2024. Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision. In Findings of the Association for Computational Linguistics ACL 2024, pages 5120–5134, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal): Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision (Yoshida et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.303.pdf