Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling

Alireza Mohammadshahi, James Henderson


Abstract
Recent models have shown that incorporating syntactic knowledge into the semantic role labelling (SRL) task leads to a significant improvement. In this paper, we propose the Syntax-aware Graph-to-Graph Transformer (SynG2G-Tr) model, which encodes the syntactic structure using a novel method of inputting graph relations as embeddings directly into the self-attention mechanism of the Transformer. This approach adds a soft bias towards attention patterns that follow the syntactic structure, but also allows the model to use this information to learn alternative patterns. We evaluate our model on both span-based and dependency-based SRL datasets, and outperform previous alternative methods in both in-domain and out-of-domain settings, on the CoNLL 2005 and CoNLL 2009 datasets.
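To make the core idea concrete, below is a minimal sketch (not the authors' released code) of how syntactic graph relations can be injected into self-attention as embeddings, in the spirit described in the abstract: relation embeddings are added on the key and value sides of attention, biasing the model towards syntactic neighbours without hard-masking other attention patterns. All names here (SyntaxAwareSelfAttention, num_relations, rel_ids, etc.) are illustrative assumptions, not identifiers from the paper.

```python
# Hypothetical sketch of syntax-aware self-attention with relation embeddings.
import math
import torch
import torch.nn as nn

class SyntaxAwareSelfAttention(nn.Module):
    def __init__(self, d_model: int, num_relations: int):
        super().__init__()
        self.d = d_model
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One embedding per syntactic relation label (index 0 = "no edge"),
        # used on both the key and value sides of attention.
        self.rel_k = nn.Embedding(num_relations, d_model)
        self.rel_v = nn.Embedding(num_relations, d_model)

    def forward(self, x, rel_ids):
        # x:       (batch, seq, d_model) token representations
        # rel_ids: (batch, seq, seq) dependency-relation label between
        #          each ordered token pair (0 where no syntactic edge)
        q, k, v = self.q(x), self.k(x), self.v(x)
        rk = self.rel_k(rel_ids)  # (batch, seq, seq, d_model)
        rv = self.rel_v(rel_ids)
        # Content-content scores plus a content-relation term: the relation
        # embeddings act as a soft bias, not a hard syntactic mask.
        scores = torch.einsum("bid,bjd->bij", q, k)
        scores = scores + torch.einsum("bid,bijd->bij", q, rk)
        attn = torch.softmax(scores / math.sqrt(self.d), dim=-1)
        # Values are likewise augmented with per-pair relation embeddings.
        out = torch.einsum("bij,bjd->bid", attn, v)
        out = out + torch.einsum("bij,bijd->bid", attn, rv)
        return out
```

Because the relation term only shifts attention scores additively, attention heads remain free to learn patterns that depart from the parse tree, which matches the abstract's framing of a soft rather than hard syntactic bias.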
Anthology ID:
2023.repl4nlp-1.15
Volume:
Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Burcu Can, Maximilian Mozes, Samuel Cahyawijaya, Naomi Saphra, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Chen Zhao, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Lena Voita
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
174–186
URL:
https://aclanthology.org/2023.repl4nlp-1.15
DOI:
10.18653/v1/2023.repl4nlp-1.15
Bibkey:
Cite (ACL):
Alireza Mohammadshahi and James Henderson. 2023. Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling. In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023), pages 174–186, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling (Mohammadshahi & Henderson, RepL4NLP 2023)
PDF:
https://aclanthology.org/2023.repl4nlp-1.15.pdf
Video:
https://aclanthology.org/2023.repl4nlp-1.15.mp4