Decoding Strategies for Neural Referring Expression Generation

Sina Zarrieß, David Schlangen


Abstract
RNN-based sequence generation is now widely used in NLP and NLG (natural language generation). Most work focuses on how to train RNNs, even though decoding is also not necessarily straightforward: previous work on neural MT found seq2seq models to radically prefer short candidates, and has proposed a number of beam search heuristics to deal with this. In this work, we assess decoding strategies for referring expression generation with neural models. Here, expression length is crucial: output should contain neither too much nor too little information in order to be pragmatically adequate. We find that most beam search heuristics developed for MT do not generalize well to referring expression generation (REG) and do not generally outperform greedy decoding. We observe that beam search heuristics for termination seem to override the model’s knowledge of what a good stopping point is. We therefore also explore a recent approach called trainable decoding, which uses a small network to modify the RNN’s hidden state for better decoding results. We find this approach to consistently outperform greedy decoding for REG.
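
To make the contrast between the decoding strategies discussed in the abstract concrete, the following is a minimal, self-contained sketch (not the authors' code) of greedy decoding versus beam search with a length-normalisation heuristic of the kind proposed for neural MT. The toy next_token_logprobs model, the vocabulary, and the specific normalisation formula (dividing by |y|^alpha) are illustrative assumptions, not taken from the paper.

```python
import math

VOCAB = ["the", "red", "ball", "on", "left", "<eos>"]
TARGET = ["the", "red", "ball", "on", "the", "left"]  # hypothetical "correct" expression


def next_token_logprobs(prefix):
    """Hypothetical stand-in for one RNN decoder step: returns a normalised
    log-probability for every vocabulary item given the prefix so far."""
    probs = {w: 0.02 for w in VOCAB}
    if len(prefix) < len(TARGET):
        probs[TARGET[len(prefix)]] = 0.7   # next word of the target expression
        probs["<eos>"] = 0.2               # model always assigns some mass to stopping
    else:
        probs["<eos>"] = 0.9
    log_z = math.log(sum(probs.values()))
    return {w: math.log(p) - log_z for w, p in probs.items()}


def greedy_decode(max_len=10):
    """Pick the single most probable token at every step until <eos>."""
    prefix = []
    while len(prefix) < max_len:
        token = max(next_token_logprobs(prefix).items(), key=lambda kv: kv[1])[0]
        if token == "<eos>":
            break
        prefix.append(token)
    return prefix


def beam_decode(beam_size=3, max_len=10, length_alpha=0.7):
    """Beam search; hypotheses are reranked by score / |y|^alpha, so that
    accumulating negative log-probabilities does not automatically favour
    short (or empty) expressions."""
    def rescore(hyp):
        tokens, score, _ = hyp
        return score / max(len(tokens), 1) ** length_alpha

    beams = [([], 0.0, False)]  # (tokens, summed log-prob, finished?)
    for _ in range(max_len):
        candidates = []
        for tokens, score, done in beams:
            if done:
                candidates.append((tokens, score, True))
                continue
            for w, lp in next_token_logprobs(tokens).items():
                if w == "<eos>":
                    candidates.append((tokens, score + lp, True))
                else:
                    candidates.append((tokens + [w], score + lp, False))
        beams = sorted(candidates, key=rescore, reverse=True)[:beam_size]
        if all(done for _, _, done in beams):
            break
    return max(beams, key=rescore)[0]


print(greedy_decode())                    # the red ball on the left
print(beam_decode(length_alpha=0.7))      # the red ball on the left
print(beam_decode(length_alpha=0.0))      # [] -- unnormalised beam search stops immediately
```

With length_alpha=0.0 (no normalisation), the finished short hypothesis has the highest raw log-probability and wins, illustrating the short-candidate preference the abstract refers to; with normalisation, the full expression is preferred. The trainable decoding approach explored in the paper is different in kind: rather than rescoring hypotheses with a fixed heuristic, a small trained network adjusts the decoder's hidden state at each step, and is not sketched here.
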
Anthology ID:
W18-6563
Volume:
Proceedings of the 11th International Conference on Natural Language Generation
Month:
November
Year:
2018
Address:
Tilburg University, The Netherlands
Editors:
Emiel Krahmer, Albert Gatt, Martijn Goudbeek
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
503–512
URL:
https://aclanthology.org/W18-6563
DOI:
10.18653/v1/W18-6563
Cite (ACL):
Sina Zarrieß and David Schlangen. 2018. Decoding Strategies for Neural Referring Expression Generation. In Proceedings of the 11th International Conference on Natural Language Generation, pages 503–512, Tilburg University, The Netherlands. Association for Computational Linguistics.
Cite (Informal):
Decoding Strategies for Neural Referring Expression Generation (Zarrieß & Schlangen, INLG 2018)
PDF:
https://aclanthology.org/W18-6563.pdf
Data:
MS COCO, RefCOCO, ReferItGame