A Generative Model for Punctuation in Dependency Trees

Xiang Lisa Li, Dingquan Wang, Jason Eisner


Abstract
Treebanks traditionally treat punctuation marks as ordinary words, but linguists have suggested that a tree’s “true” punctuation marks are not observed (Nunberg, 1990). These latent “underlying” marks serve to delimit or separate constituents in the syntax tree. When the tree’s yield is rendered as a written sentence, a string rewriting mechanism transduces the underlying marks into “surface” marks, which are part of the observed (surface) string but should not be regarded as part of the tree. We formalize this idea in a generative model of punctuation that admits efficient dynamic programming. We train it without observing the underlying marks, by locally maximizing the incomplete data likelihood (similarly to the EM algorithm). When we use the trained model to reconstruct the tree’s underlying punctuation, the results appear plausible across 5 languages, and in particular are consistent with Nunberg’s analysis of English. We show that our generative model can be used to beat baselines on punctuation restoration. Also, our reconstruction of a sentence’s underlying punctuation lets us appropriately render the surface punctuation (via our trained underlying-to-surface mechanism) when we syntactically transform the sentence.
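To make the abstract's central idea concrete, here is a minimal, hypothetical Python sketch, not the paper's actual model or code: underlying marks delimit constituents in the tree's yield, and a rewrite step maps them to the observed surface marks. The single hand-written absorption rule (a comma adjacent to a period does not surface) is merely in the spirit of Nunberg's analysis; all names and rules below are invented for illustration.

```python
# Hypothetical sketch of the underlying-vs-surface distinction described in the
# abstract. NOT the paper's model: the paper LEARNS its rewrite mechanism from
# data, whereas the rule below is hand-written purely for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Constituent:
    """A constituent whose yield may be delimited by underlying marks."""
    words: List[str]
    left_marks: List[str] = field(default_factory=list)
    right_marks: List[str] = field(default_factory=list)

def underlying_sequence(constituents: List[Constituent]) -> List[str]:
    """Render the tree's yield with its latent underlying marks in place."""
    tokens: List[str] = []
    for c in constituents:
        tokens += c.left_marks + c.words + c.right_marks
    return tokens

def rewrite_to_surface(tokens: List[str]) -> List[str]:
    """Toy left-to-right rewrite: of two adjacent marks, only the 'stronger'
    one surfaces (a comma next to a period is absorbed)."""
    strength = {",": 1, ";": 2, ".": 3}
    out: List[str] = []
    for tok in tokens:
        if out and tok in strength and out[-1] in strength:
            if strength[tok] > strength[out[-1]]:
                out[-1] = tok  # stronger mark replaces the weaker one
            continue           # weaker adjacent mark is absorbed
        out.append(tok)
    return out

# An appositive contributes an underlying comma on each edge; the root
# contributes the sentence-final period, which absorbs the trailing comma.
tree = [
    Constituent(["She", "met", "the", "author"]),
    Constituent(["a", "linguist"], left_marks=[","], right_marks=[","]),
]
underlying = underlying_sequence(tree) + ["."]
print(" ".join(underlying))                      # She met the author , a linguist , .
print(" ".join(rewrite_to_surface(underlying)))  # She met the author , a linguist .
```

In the paper itself the underlying marks are latent and the rewrite step is a learned transducer, trained by locally maximizing the incomplete-data likelihood with dynamic programming over the tree; the hand-written absorption rule above only stands in for that learned mechanism.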
Anthology ID: Q19-1023
Volume: Transactions of the Association for Computational Linguistics, Volume 7
Year: 2019
Address: Cambridge, MA
Editors: Lillian Lee, Mark Johnson, Brian Roark, Ani Nenkova
Venue: TACL
Publisher: MIT Press
Pages: 357–373
URL: https://aclanthology.org/Q19-1023
DOI: 10.1162/tacl_a_00273
Cite (ACL): Xiang Lisa Li, Dingquan Wang, and Jason Eisner. 2019. A Generative Model for Punctuation in Dependency Trees. Transactions of the Association for Computational Linguistics, 7:357–373.
Cite (Informal): A Generative Model for Punctuation in Dependency Trees (Li et al., TACL 2019)
PDF: https://aclanthology.org/Q19-1023.pdf