Multilingual Chart-based Constituency Parse Extraction from Pre-trained Language Models

Taeuk Kim, Bowen Li, Sang-goo Lee


Abstract
As it has been unveiled that pre-trained language models (PLMs) are to some extent capable of recognizing syntactic concepts in natural language, much effort has been made to develop a method for extracting complete (binary) parses from PLMs without training separate parsers. We improve upon this paradigm by proposing a novel chart-based method and an effective top-K ensemble technique. Moreover, we demonstrate that the approach can be extended to multilingual settings. Specifically, we show that by applying our method to multilingual PLMs, it becomes possible to induce non-trivial parses for sentences from nine languages in an integrated and language-agnostic manner, attaining performance superior or comparable to that of unsupervised PCFGs. We also verify that our approach is robust to cross-lingual transfer. Finally, we provide analyses of the inner workings of our method. For instance, we discover universal attention heads that are consistently sensitive to syntactic information irrespective of the input language.
Anthology ID:
2021.findings-emnlp.41
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
454–463
URL:
https://aclanthology.org/2021.findings-emnlp.41
DOI:
10.18653/v1/2021.findings-emnlp.41
Cite (ACL):
Taeuk Kim, Bowen Li, and Sang-goo Lee. 2021. Multilingual Chart-based Constituency Parse Extraction from Pre-trained Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 454–463, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Multilingual Chart-based Constituency Parse Extraction from Pre-trained Language Models (Kim et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.41.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.41.mp4