WebNLG-Interno: Utilizing FRED-T5 to address the RDF-to-text problem (WebNLG 2023)

Maxim Kazakov, Julia Preobrazhenskaya, Ivan Bulychev, Aleksandr Shain


Abstract
We present our solution for the Russian RDF-to-text generation task of the WebNLG Challenge 2023. We fine-tune the pretrained large language model FRED-T5 (Zmitrovich et al., 2023) on the train dataset. In addition, we propose several prompt types and run experiments to analyze their effectiveness. Our submission achieves 0.373 TER on the test dataset, taking first place in the automatic evaluation and outperforming the best result of the previous challenge by 0.025. The code of our solution is available at the following link: https://github.com/Ivan30003/webnlg_interno
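The abstract does not specify how the RDF input is presented to the model. As a hedged illustration only, a common approach in RDF-to-text systems linearizes the input triples into a flat textual prompt before feeding them to a seq2seq model such as FRED-T5; the function name, separator tokens, and instruction prefix below are hypothetical, not the authors' actual prompt types.

```python
def linearize_triples(triples):
    """Linearize RDF triples into a single prompt string.

    Each triple is a (subject, predicate, object) tuple. The <S>/<P>/<O>
    separator tokens and the instruction prefix are illustrative choices,
    not the ones used in the WebNLG-Interno submission.
    """
    parts = []
    for subj, pred, obj in triples:
        parts.append(f"<S> {subj} <P> {pred} <O> {obj}")
    return "Generate text: " + " ".join(parts)

# Example: a single WebNLG-style triple becomes one prompt string.
prompt = linearize_triples([
    ("Aarhus_Airport", "cityServed", "Aarhus,_Denmark"),
])
```

The resulting string can then be tokenized and passed to the fine-tuned model; the effect of different prompt formulations is what the paper's experiments compare.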
Anthology ID:
2023.mmnlg-1.7
Volume:
Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023)
Month:
September
Year:
2023
Address:
Prague, Czech Republic
Editors:
Albert Gatt, Claire Gardent, Liam Cripwell, Anya Belz, Claudia Borg, Aykut Erdem, Erkut Erdem
Venues:
MMNLG | WS
Publisher:
Association for Computational Linguistics
Pages:
67–72
URL:
https://aclanthology.org/2023.mmnlg-1.7
Cite (ACL):
Maxim Kazakov, Julia Preobrazhenskaya, Ivan Bulychev, and Aleksandr Shain. 2023. WebNLG-Interno: Utilizing FRED-T5 to address the RDF-to-text problem (WebNLG 2023). In Proceedings of the Workshop on Multimodal, Multilingual Natural Language Generation and Multilingual WebNLG Challenge (MM-NLG 2023), pages 67–72, Prague, Czech Republic. Association for Computational Linguistics.
Cite (Informal):
WebNLG-Interno: Utilizing FRED-T5 to address the RDF-to-text problem (WebNLG 2023) (Kazakov et al., MMNLG-WS 2023)
PDF:
https://aclanthology.org/2023.mmnlg-1.7.pdf