Semantic Frame Induction with Deep Metric Learning

Kosuke Yamada, Ryohei Sasano, Koichi Takeda


Abstract
Recent studies have demonstrated the usefulness of contextualized word embeddings in unsupervised semantic frame induction. However, they have also revealed that generic contextualized embeddings are not always consistent with human intuitions about semantic frames, which causes unsatisfactory performance for frame induction based on contextualized embeddings. In this paper, we address supervised semantic frame induction, which assumes the existence of frame-annotated data for a subset of predicates in a corpus and aims to build a frame induction model that leverages the annotated data. We propose a model that uses deep metric learning to fine-tune a contextualized embedding model, and we apply the fine-tuned contextualized embeddings to perform semantic frame induction. Our experiments on FrameNet show that fine-tuning with deep metric learning considerably improves the clustering evaluation scores, namely, the B-cubed F-score and Purity F-score, by about 8 points or more. We also demonstrate that our approach is effective even when the number of training instances is small.
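The abstract evaluates clustering quality with the B-cubed F-score and Purity F-score. For readers unfamiliar with these metrics, the following is a minimal sketch of both (standard definitions, not code from the paper; the function names and the dict-of-labels input format are illustrative choices):

```python
from collections import defaultdict

def _clusters(labels):
    # Group items by their cluster label: {item: label} -> {label: {items}}.
    clusters = defaultdict(set)
    for item, label in labels.items():
        clusters[label].add(item)
    return clusters

def bcubed_f1(gold, pred):
    # B-cubed F-score: per-item precision/recall averaged over all items,
    # combined with the harmonic mean.
    gold_c, pred_c = _clusters(gold), _clusters(pred)
    precisions, recalls = [], []
    for item in gold:
        overlap = len(gold_c[gold[item]] & pred_c[pred[item]])
        precisions.append(overlap / len(pred_c[pred[item]]))
        recalls.append(overlap / len(gold_c[gold[item]]))
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    return 2 * p * r / (p + r)

def purity_f1(gold, pred):
    # Purity F-score: harmonic mean of purity (each predicted cluster
    # scored by its best-matching gold cluster) and inverse purity.
    gold_c, pred_c = _clusters(gold), _clusters(pred)
    n = len(gold)
    purity = sum(max(len(pc & gc) for gc in gold_c.values())
                 for pc in pred_c.values()) / n
    inv_purity = sum(max(len(gc & pc) for pc in pred_c.values())
                     for gc in gold_c.values()) / n
    return 2 * purity * inv_purity / (purity + inv_purity)
```

In the frame-induction setting, the items would be predicate instances and the gold labels their annotated FrameNet frames; both metrics equal 1.0 exactly when the induced clusters match the gold frames.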
Anthology ID:
2023.eacl-main.134
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1833–1845
URL:
https://aclanthology.org/2023.eacl-main.134
DOI:
10.18653/v1/2023.eacl-main.134
Cite (ACL):
Kosuke Yamada, Ryohei Sasano, and Koichi Takeda. 2023. Semantic Frame Induction with Deep Metric Learning. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1833–1845, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Semantic Frame Induction with Deep Metric Learning (Yamada et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.134.pdf
Video:
https://aclanthology.org/2023.eacl-main.134.mp4