Punctuation Restoration Improves Structure Understanding without Supervision

Junghyun Min, Minho Lee, Woochul Lee, Yeonsoo Lee


Abstract
Unsupervised learning objectives like autoregressive and masked language modeling play a significant role in producing pre-trained representations that support downstream applications ranging from natural language understanding to conversational tasks. However, despite the impressive generative capabilities of recent large language models, their ability to capture syntactic or semantic structure within text lags behind. We hypothesize that this mismatch between linguistic performance and competence in machines is attributable to insufficient learning of linguistic structure via currently popular pre-training objectives. Working with English, we show that punctuation restoration as a learning objective improves performance on structure-related tasks like named entity recognition, open information extraction, chunking, and part-of-speech tagging. Punctuation restoration yields a ≥2%p improvement in 16 out of 18 experiments, across 6 out of 7 tasks. Our results show that punctuation restoration is an effective learning objective that improves structure understanding and yields more robust, structure-aware representations of natural language in base-sized models.
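The objective described in the abstract is straightforward to illustrate. Below is a minimal sketch, not the authors' code, of how punctuation restoration can be posed as a self-supervised task: punctuation is stripped from raw text to form the corrupted input, and the original text serves as the restoration target. The function name and the sequence-to-sequence framing are illustrative assumptions, not taken from the paper.

import string

# Illustrative sketch (not from the paper): build one self-supervised
# training pair by deleting punctuation and keeping the original as target.
PUNCT = set(string.punctuation)

def make_restoration_pair(text: str) -> tuple[str, str]:
    """Return (corrupted_input, target) for one training example."""
    # Drop every punctuation character, then collapse extra whitespace.
    stripped = "".join(ch for ch in text if ch not in PUNCT)
    corrupted = " ".join(stripped.split())
    return corrupted, text

if __name__ == "__main__":
    src = "However, despite impressive capabilities, structure lags behind."
    inp, tgt = make_restoration_pair(src)
    print(inp)  # "However despite impressive capabilities structure lags behind"
    print(tgt)  # original sentence, which the model learns to restore

Because the pairs are derived from raw text alone, no labeled data is needed, which is what makes the objective unsupervised.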
Anthology ID:
2025.repl4nlp-1.10
Volume:
Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025)
Month:
May
Year:
2025
Address:
Albuquerque, NM
Editors:
Vaibhav Adlakha, Alexandra Chronopoulou, Xiang Lorraine Li, Bodhisattwa Prasad Majumder, Freda Shi, Giorgos Vernikos
Venues:
RepL4NLP | WS
Publisher:
Association for Computational Linguistics
Pages:
120–130
URL:
https://aclanthology.org/2025.repl4nlp-1.10/
Cite (ACL):
Junghyun Min, Minho Lee, Woochul Lee, and Yeonsoo Lee. 2025. Punctuation Restoration Improves Structure Understanding without Supervision. In Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025), pages 120–130, Albuquerque, NM. Association for Computational Linguistics.
Cite (Informal):
Punctuation Restoration Improves Structure Understanding without Supervision (Min et al., RepL4NLP 2025)
PDF:
https://aclanthology.org/2025.repl4nlp-1.10.pdf