Seungyeon Kim


2024

Analysis of Plan-based Retrieval for Grounded Text Generation
Ameya Godbole | Nicholas Monath | Seungyeon Kim | Ankit Singh Rawat | Andrew McCallum | Manzil Zaheer
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing

In text generation, hallucinations refer to the generation of seemingly coherent text that contradicts established knowledge. One compelling hypothesis is that hallucinations occur when a language model is given a generation task outside its parametric knowledge (due to rarity, recency, domain, etc.). A common strategy for addressing this limitation is to augment language models with retrieval mechanisms, providing the model with knowledge relevant to the task. In this paper, we leverage the planning capabilities of instruction-tuned LLMs and analyze how planning can be used to guide retrieval and further reduce the frequency of hallucinations. We empirically evaluate several variations of our proposed approach on long-form text generation tasks. By improving the coverage of relevant facts, plan-guided retrieval and generation can produce more informative responses while providing a higher rate of attribution to source documents.
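
For intuition only, a minimal sketch of the plan-then-retrieve-then-generate loop the abstract describes, assuming a generic instruction-tuned model callable (llm) and a search interface (retriever). All names here are hypothetical stand-ins, not the paper's implementation:

def plan_guided_generate(question: str, llm, retriever, k: int = 3) -> str:
    # 1. Ask the instruction-tuned LLM to plan: outline the facts the
    #    answer will need. (Illustrative prompt, not the paper's.)
    plan = llm(
        f"List the key sub-questions or facts needed to answer:\n{question}"
    )
    plan_items = [line.strip("- ") for line in plan.splitlines() if line.strip()]

    # 2. Use each plan item as its own retrieval query, which can improve
    #    fact coverage over retrieving with the original question alone.
    evidence = []
    for item in plan_items:
        evidence.extend(retriever.search(item, top_k=k))

    # 3. Generate the final answer conditioned on the retrieved passages,
    #    so claims can be attributed back to source documents.
    context = "\n\n".join(dict.fromkeys(evidence))  # de-duplicate, keep order
    return llm(
        f"Answer the question using only the sources below.\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )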

2020

Semantic Label Smoothing for Sequence to Sequence Problems
Michal Lukasik | Himanshu Jain | Aditya Menon | Seungyeon Kim | Srinadh Bhojanapalli | Felix Yu | Sanjiv Kumar
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Label smoothing has been shown to be an effective regularization strategy in classification: it prevents overfitting and helps with label de-noising. However, extending such methods directly to seq2seq settings, such as machine translation, is challenging: the large target output space of these problems makes it intractable to apply label smoothing over all possible outputs. Most existing approaches for seq2seq settings either apply token-level smoothing, or smooth over sequences generated by randomly substituting tokens in the target sequence. Unlike these works, we propose a technique that smooths over well-formed, relevant sequences that not only have sufficient n-gram overlap with the target sequence but are also semantically similar. Our method shows consistent and significant improvements over state-of-the-art techniques on several datasets.
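
To make the idea concrete, a minimal PyTorch sketch of sequence-level label smoothing restricted to a small candidate set, in the spirit described above. The candidate mining (n-gram overlap plus semantic similarity) is assumed to have happened offline; the loss shape and the teacher-forced model interface are assumptions, not the paper's actual machinery:

import torch
import torch.nn.functional as F

def sequence_nll(logits: torch.Tensor, tokens: torch.Tensor) -> torch.Tensor:
    # Summed token-level negative log-likelihood of `tokens` under `logits`
    # of shape (seq_len, vocab_size), aligned position by position.
    return F.cross_entropy(logits, tokens, reduction="sum")

def smoothed_seq_loss(model, src, target, candidates, epsilon: float = 0.1):
    # Assumes `model(src, tgt)` runs teacher-forced decoding and returns
    # per-position vocabulary logits for `tgt` (a hypothetical interface).
    gold = sequence_nll(model(src, target), target)
    if not candidates:
        return gold
    # Spread the smoothing mass `epsilon` uniformly over the pre-mined,
    # semantically similar candidates instead of over the full
    # (intractable) space of output sequences.
    cand = torch.stack(
        [sequence_nll(model(src, c), c) for c in candidates]
    ).mean()
    return (1.0 - epsilon) * gold + epsilon * cand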

2010

Local Space-Time Smoothing for Version Controlled Documents
Seungyeon Kim | Guy Lebanon
Coling 2010: Posters