System description for ProfNER - SMMH: Optimized finetuning of a pretrained transformer and word vectors

David Carreto Fidalgo, Daniel Vila-Suero, Francisco Aranda Montes, Ignacio Talavera Cepeda


Abstract
This shared-task system description presents two neural network architectures submitted to the ProfNER track, including the winning system, which achieved the highest scores in both sub-tasks 7a and 7b. We describe the approach, the preprocessing steps, and the architectures used to achieve the submitted results in detail, and provide a GitHub repository to reproduce the scores. The winning system is based on a pretrained transformer language model and solves both sub-tasks simultaneously.
Anthology ID:
2021.smm4h-1.11
Volume:
Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task
Month:
June
Year:
2021
Address:
Mexico City, Mexico
Editors:
Arjun Magge, Ari Klein, Antonio Miranda-Escalada, Mohammed Ali Al-garadi, Ilseyar Alimova, Zulfat Miftahutdinov, Eulalia Farre-Maduell, Salvador Lima Lopez, Ivan Flores, Karen O'Connor, Davy Weissenbacher, Elena Tutubalina, Abeed Sarker, Juan M Banda, Martin Krallinger, Graciela Gonzalez-Hernandez
Venue:
SMM4H
Publisher:
Association for Computational Linguistics
Pages:
69–73
URL:
https://aclanthology.org/2021.smm4h-1.11
DOI:
10.18653/v1/2021.smm4h-1.11
Cite (ACL):
David Carreto Fidalgo, Daniel Vila-Suero, Francisco Aranda Montes, and Ignacio Talavera Cepeda. 2021. System description for ProfNER - SMMH: Optimized finetuning of a pretrained transformer and word vectors. In Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task, pages 69–73, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
System description for ProfNER - SMMH: Optimized finetuning of a pretrained transformer and word vectors (Carreto Fidalgo et al., SMM4H 2021)
PDF:
https://aclanthology.org/2021.smm4h-1.11.pdf