AMR Alignment for Morphologically-rich and Pro-drop Languages

K. Elif Oral, Gülşen Eryiğit


Abstract
Alignment between the concepts in an abstract meaning representation (AMR) graph and the words of a sentence is one of the important stages of AMR parsing. Although high-performing AMR aligners exist for English, they are unfortunately not well suited for the many languages in which concepts emerge from morpho-semantic elements. For the first time in the literature, this paper presents an AMR aligner tailored to morphologically rich and pro-drop languages, experimenting on Turkish, a prominent example of this language group. Our aligner focuses on meaning, taking the rich Turkish morphology into account, and aligns AMR concepts that emerge from morphemes using a tree traversal approach, without additional resources or rules. We evaluate our aligner on a manually annotated gold data set in terms of precision, recall, and F1 score. It outperforms the Turkish adaptations of aligners previously proposed for English and Portuguese, achieving an F1 score of 0.87 and a relative error reduction of up to 76%.
Anthology ID:
2022.acl-srw.13
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Samuel Louvan, Andrea Madotto, Brielen Madureira
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
143–152
URL:
https://aclanthology.org/2022.acl-srw.13
DOI:
10.18653/v1/2022.acl-srw.13
Cite (ACL):
K. Elif Oral and Gülşen Eryiğit. 2022. AMR Alignment for Morphologically-rich and Pro-drop Languages. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 143–152, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
AMR Alignment for Morphologically-rich and Pro-drop Languages (Oral & Eryiğit, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-srw.13.pdf