Nearest Neighbor Language Models for Stylistic Controllable Generation

Severino Trotta, Lucie Flek, Charles Welch


Abstract
Language modeling performance has recently been greatly improved by the use of external memory. This memory encodes the context so that similar contexts can be recalled during decoding. The similarity depends on how the model learns to encode context, and this encoding can be altered to include other attributes, such as style. We construct and evaluate an architecture for this purpose, using corpora annotated for politeness, formality, and toxicity. Through extensive experiments and human evaluation, we demonstrate the potential of our method to generate text while controlling style. We find that style-specific datastores improve generation performance, though results vary greatly across styles; the effects of pretraining data and of specific styles should be explored in future work.
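The abstract alludes to the nearest-neighbor language model mechanism: context embeddings are stored as keys in a datastore paired with their observed next tokens, the nearest neighbors of the current decoding context are retrieved, and their tokens are interpolated with the base model's distribution. The sketch below shows this standard kNN-LM interpolation (in the spirit of Khandelwal et al., 2020) with a style-specific datastore; it is a minimal illustration under assumed names (knn_lm_probs, datastore_keys, lam, and the fixed-lambda interpolation are all hypothetical), not the authors' exact architecture.

# Minimal kNN-LM interpolation sketch; standard formulation assumed,
# not necessarily the method evaluated in the paper.
import numpy as np

def knn_lm_probs(query, datastore_keys, datastore_vals, lm_probs,
                 k=8, temperature=1.0, lam=0.25):
    """Interpolate the base LM distribution with a kNN distribution.

    query:          context embedding at the current decoding step, shape (d,)
    datastore_keys: stored context embeddings for one style, shape (n, d)
    datastore_vals: next-token ids paired with each key, shape (n,)
    lm_probs:       base LM next-token distribution, shape (vocab,)
    """
    # Squared L2 distance from the query to every stored context.
    dists = np.sum((datastore_keys - query) ** 2, axis=1)
    nn = np.argsort(dists)[:k]  # indices of the k nearest neighbors

    # Softmax over negative distances: closer contexts get more weight.
    # Subtracting the minimum distance keeps the exponentials stable.
    weights = np.exp(-(dists[nn] - dists[nn].min()) / temperature)
    weights /= weights.sum()

    # Scatter neighbor weights onto their recorded next tokens.
    knn_probs = np.zeros_like(lm_probs)
    for w, tok in zip(weights, datastore_vals[nn]):
        knn_probs[tok] += w

    # Final distribution: interpolate retrieval with the parametric model.
    return lam * knn_probs + (1.0 - lam) * lm_probs

Under this reading, style control amounts to choosing which datastore supplies the keys and values: a datastore built only from, say, polite text biases retrieval, and hence generation, toward that style.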
Anthology ID:
2022.gem-1.25
Volume:
Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Antoine Bosselut, Khyathi Chandu, Kaustubh Dhole, Varun Gangal, Sebastian Gehrmann, Yacine Jernite, Jekaterina Novikova, Laura Perez-Beltrachini
Venue:
GEM
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
295–305
URL:
https://aclanthology.org/2022.gem-1.25
DOI:
10.18653/v1/2022.gem-1.25
Cite (ACL):
Severino Trotta, Lucie Flek, and Charles Welch. 2022. Nearest Neighbor Language Models for Stylistic Controllable Generation. In Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM), pages 295–305, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Nearest Neighbor Language Models for Stylistic Controllable Generation (Trotta et al., GEM 2022)
PDF:
https://aclanthology.org/2022.gem-1.25.pdf
Video:
https://aclanthology.org/2022.gem-1.25.mp4