Julia Preobrazhenskaya


2023

WebNLG-Interno: Utilizing FRED-T5 to address the RDF-to-text problem (WebNLG 2023)
Maxim Kazakov | Julia Preobrazhenskaya | Ivan Bulychev | Aleksandr Shain
Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023)

We present our solution for the Russian RDF-to-text generation task of the WebNLG Challenge 2023. We fine-tune the pretrained large language model FRED-T5 (Zmitrovich et al., 2023) on the training dataset. We also propose several prompt types and run experiments to analyze their effectiveness. Our submission achieves a TER of 0.373 on the test dataset, taking first place in the automatic evaluation and outperforming the best result of the previous challenge by 0.025. The code of our solution is available at the following link: https://github.com/Ivan30003/webnlg_interno
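
As a rough illustration of the kind of pipeline the abstract describes, the sketch below linearizes an RDF triple set into a textual prompt and feeds it to a publicly available FRED-T5 checkpoint via the Hugging Face transformers API. The checkpoint name, prompt wording, special markers, and decoding settings are assumptions made for illustration, not the authors' exact setup; the linked repository contains the actual prompts and training code.

```python
# Minimal sketch (not the authors' exact pipeline): linearize RDF triples into a
# prompt and generate text with a FRED-T5 checkpoint from the Hugging Face Hub.
# The checkpoint name, prompt template, and generation settings are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "ai-forever/FRED-T5-large"  # assumed checkpoint; the paper may use a different size

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def linearize_triples(triples):
    """Turn (subject, predicate, object) triples into a flat prompt string.
    The marker tokens and instruction text are illustrative only; the real
    Russian-track prompts may be phrased in Russian."""
    parts = [f"<subject> {s} <predicate> {p} <object> {o}" for s, p, o in triples]
    return "Generate a description from RDF triples: " + " ".join(parts)

triples = [("Aarhus_Airport", "cityServed", "Aarhus, Denmark")]
inputs = tokenizer(linearize_triples(triples), return_tensors="pt")

# Greedy decoding keeps the example short; the abstract does not specify decoding.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

In the paper's setting, the same prompt-formatted inputs would be paired with reference texts from the WebNLG training split and used to fine-tune the model (for example with a standard sequence-to-sequence training loop) before generation.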