HULAT-UC3M at BiolaySumm: Adaptation of BioBART and Longformer models to summarizing biomedical documents

Adrian Gonzalez Sanchez, Paloma Martínez


Abstract
This article presents our submission to the BioLaySumm 2024 shared task: Lay Summarization of Biomedical Research Articles. The objective of this task is to generate concise, simplified, and less technical summaries in order to facilitate comprehension by non-expert users. A pre-trained BioBART model was fine-tuned on the articles from the two journals, yielding two models, one per journal. The submission achieved the 12th best ranking in the task, attaining first place in the Relevance ROUGE-1 metric.
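
The sketch below illustrates the kind of fine-tuning the abstract describes: adapting a BioBART checkpoint to lay summarization with the Hugging Face Trainer, one run per journal. It is a minimal assumption-laden illustration, not the authors' pipeline; the checkpoint name, data file names, column names ("article", "lay_summary"), and hyperparameters are placeholders.

# Minimal sketch (assumptions, not the paper's exact setup): fine-tune a
# BioBART checkpoint for lay summarization with Hugging Face transformers.
from datasets import load_dataset
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

checkpoint = "GanjinZero/biobart-v2-base"   # assumed BioBART checkpoint on the Hub
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# One dataset per journal (e.g. eLife or PLOS) would yield one model each.
raw = load_dataset("json", data_files={"train": "elife_train.jsonl",
                                       "validation": "elife_val.jsonl"})

def preprocess(batch):
    # Truncate long articles to the encoder limit; tokenize lay summaries as labels.
    inputs = tokenizer(batch["article"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["lay_summary"], max_length=256, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True,
                    remove_columns=raw["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="biobart-lay-summ",      # illustrative values only
    learning_rate=3e-5,
    per_device_train_batch_size=4,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

Repeating this procedure on the second journal's training split would produce the second journal-specific model mentioned in the abstract.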
Anthology ID:
2024.bionlp-1.71
Volume:
Proceedings of the 23rd Workshop on Biomedical Natural Language Processing
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Makoto Miwa, Kirk Roberts, Junichi Tsujii
Venues:
BioNLP | WS
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
780–785
URL:
https://aclanthology.org/2024.bionlp-1.71
DOI:
10.18653/v1/2024.bionlp-1.71
Cite (ACL):
Adrian Gonzalez Sanchez and Paloma Martínez. 2024. HULAT-UC3M at BiolaySumm: Adaptation of BioBART and Longformer models to summarizing biomedical documents. In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, pages 780–785, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
HULAT-UC3M at BiolaySumm: Adaptation of BioBART and Longformer models to summarizing biomedical documents (Gonzalez Sanchez & Martínez, BioNLP-WS 2024)
PDF:
https://aclanthology.org/2024.bionlp-1.71.pdf