An Analysis of Attention in German Verbal Idiom Disambiguation

Rafael Ehren, Laura Kallmeyer, Timm Lichte


Abstract
In this paper we examine a BiLSTM architecture for disambiguating verbal potentially idiomatic expressions (PIEs), i.e., for classifying whether they are used in a literal or an idiomatic reading, with respect to the explainability of its decisions. Concretely, we extend the BiLSTM with an additional attention mechanism and track the elements that receive the highest attention. The goal is to better understand which parts of an input sentence are particularly discriminative for the classifier’s decision, based on the assumption that these elements receive higher attention than others. In particular, we investigate the POS tags and dependency relations to the PIE verb of the tokens with maximal attention. It turns out that the elements with maximal attention are often nouns that are the subjects of the PIE verb. For longer sentences, however (i.e., sentences containing, among other things, more modifiers), the word with the highest attention often stands in a modifying relation to the PIE components. This is particularly frequent for PIEs classified as literal. Our study shows that an attention mechanism can contribute to the explainability of classification decisions that depend on specific cues in the sentential context, as is the case for PIE disambiguation.
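The analysis described in the abstract hinges on attention pooling over per-token BiLSTM states and on tracking the max-attention token. The following is a minimal NumPy sketch of that idea, not the paper's actual implementation: the scoring vector `w`, the dimensions, and the function `attention_pool` are illustrative assumptions standing in for the trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_pool(hidden_states, w):
    """Dot-product attention over per-token states.
    hidden_states: (T, d) array, one row per token (stand-in for BiLSTM outputs);
    w: (d,) learned scoring vector (here random, for illustration).
    Returns the pooled context vector and the per-token attention weights."""
    scores = hidden_states @ w                      # (T,) raw attention scores
    scores = scores - scores.max()                  # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum() # softmax -> weights sum to 1
    context = weights @ hidden_states               # (d,) attention-weighted sum
    return context, weights

# Toy example: a 5-token sentence with 8-dimensional states.
H = rng.normal(size=(5, 8))
w = rng.normal(size=8)
context, att = attention_pool(H, w)

# The index tracked in the analysis: the token with maximal attention,
# whose POS tag and dependency relation to the PIE verb would be inspected.
top_token = int(np.argmax(att))
```

In the paper's setting, `top_token` would be mapped back to the input sentence so its POS tag and its dependency relation to the PIE verb can be tallied across the corpus.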
Anthology ID:
2022.mwe-1.5
Volume:
Proceedings of the 18th Workshop on Multiword Expressions @LREC2022
Month:
June
Year:
2022
Address:
Marseille, France
Venue:
MWE
SIG:
SIGLEX
Publisher:
European Language Resources Association
Pages:
16–25
URL:
https://aclanthology.org/2022.mwe-1.5
Cite (ACL):
Rafael Ehren, Laura Kallmeyer, and Timm Lichte. 2022. An Analysis of Attention in German Verbal Idiom Disambiguation. In Proceedings of the 18th Workshop on Multiword Expressions @LREC2022, pages 16–25, Marseille, France. European Language Resources Association.
Cite (Informal):
An Analysis of Attention in German Verbal Idiom Disambiguation (Ehren et al., MWE 2022)
PDF:
https://aclanthology.org/2022.mwe-1.5.pdf
Optional supplementary material:
 2022.mwe-1.5.OptionalSupplementaryMaterial.pdf
Code
 rafehr/pie-attention