Improved Latent Tree Induction with Distant Supervision via Span Constraints

Zhiyang Xu, Andrew Drozdov, Jay Yoon Lee, Tim O’Gorman, Subendhu Rongali, Dylan Finkbeiner, Shilpa Suresh, Mohit Iyyer, Andrew McCallum


Abstract
For over thirty years, researchers have developed and analyzed methods for latent tree induction as an approach to unsupervised syntactic parsing. Nonetheless, modern systems still do not perform well enough, compared to their supervised counterparts, to be of practical use for structural annotation of text. In this work, we present a technique that uses distant supervision in the form of span constraints (i.e., phrase bracketing) to improve performance in unsupervised constituency parsing. Using a relatively small number of span constraints, we can substantially improve the output of DIORA, an already competitive unsupervised parsing system. Compared with full parse tree annotation, span constraints can be acquired with minimal effort, for example by finding exact text matches against a lexicon derived from Wikipedia. Our experiments show that span constraints based on entities improve constituency parsing on the English WSJ Penn Treebank by more than 5 F1. Furthermore, our method extends to any domain where span constraints are easily attainable, and as a case study we demonstrate its effectiveness by parsing biomedical text from the CRAFT dataset.
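As a rough illustration of how such span constraints could be gathered, exact lexicon matching over a tokenized sentence is sketched below. This is not the authors' released pipeline; the lexicon, tokenization, and function names are assumptions made for the example.

# Illustrative sketch only: derive span constraints for one sentence by
# exact matching against an entity lexicon (e.g., one built from Wikipedia
# titles). Names and the toy lexicon below are hypothetical.

def find_span_constraints(tokens, lexicon, max_len=8):
    """Return (start, end) token spans (end exclusive) whose surface text
    exactly matches a lexicon entry; longer matches take priority and
    overlapping shorter matches are skipped."""
    constraints = []
    covered = set()
    # Scan longer spans first so nested sub-spans of a match are not added.
    for length in range(max_len, 0, -1):
        for start in range(len(tokens) - length + 1):
            end = start + length
            if any(i in covered for i in range(start, end)):
                continue
            if " ".join(tokens[start:end]).lower() in lexicon:
                constraints.append((start, end))
                covered.update(range(start, end))
    return sorted(constraints)

# Usage: spans found this way would serve as distant-supervision bracketing
# constraints when training an unsupervised parser such as DIORA.
lexicon = {"penn treebank", "new york stock exchange"}  # hypothetical lexicon
tokens = "Trading resumed on the New York Stock Exchange yesterday".split()
print(find_span_constraints(tokens, lexicon))  # -> [(4, 8)]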
Anthology ID:
2021.emnlp-main.395
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4818–4831
URL:
https://aclanthology.org/2021.emnlp-main.395
DOI:
10.18653/v1/2021.emnlp-main.395
Cite (ACL):
Zhiyang Xu, Andrew Drozdov, Jay Yoon Lee, Tim O’Gorman, Subendhu Rongali, Dylan Finkbeiner, Shilpa Suresh, Mohit Iyyer, and Andrew McCallum. 2021. Improved Latent Tree Induction with Distant Supervision via Span Constraints. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4818–4831, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improved Latent Tree Induction with Distant Supervision via Span Constraints (Xu et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.395.pdf
Video:
https://aclanthology.org/2021.emnlp-main.395.mp4
Code:
iesl/distantly-supervised-diora
Data:
MedMentions, Penn Treebank