How Far Can We Go with Just Out-of-the-box BERT Models?

Lucie Gattepaille


Abstract
Social media have been seen as a promising data source for pharmacovigilance. However, methods for automatic extraction of Adverse Drug Reactions from social media platforms such as Twitter still need further development before they can be included reliably in routine pharmacovigilance practices. As Bidirectional Encoder Representations from Transformers (BERT) models have recently shown great performance on many major NLP tasks, we decided to test their performance on the SMM4H Shared Tasks 1 to 3, by submitting results of pretrained and fine-tuned BERT models without any added knowledge beyond that carried in the training and additional datasets. All three of our submissions ended up above the average over all teams' submissions: 0.766 F1 for Task 1 (15% above the average of 0.665), 0.47 F1 for Task 2 (2% above the average of 0.46) and 0.380 F1 for Task 3 (30% above the average of 0.292). Used in many of the high-ranking submissions in the 2019 edition of the SMM4H Shared Task, BERT continues to be state-of-the-art in ADR extraction from Twitter data.
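The F1 scores reported above are the harmonic mean of precision and recall, computed over the positive (ADR) class of each task. As a minimal sketch of how such a score is obtained from a model's binary predictions (the function name and label encoding here are illustrative, not taken from the paper):

```python
def positive_class_f1(y_true, y_pred, positive=1):
    """Return (precision, recall, F1) for the positive class.

    y_true, y_pred: sequences of labels, where `positive` marks
    the class of interest (e.g. "tweet mentions an ADR").
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # Harmonic mean of precision and recall; 0 when both are 0.
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1
```

For example, with gold labels `[1, 1, 0, 0, 1]` and predictions `[1, 0, 0, 1, 1]`, there are 2 true positives, 1 false positive and 1 false negative, giving precision = recall = F1 = 2/3.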
Anthology ID:
2020.smm4h-1.14
Volume:
Proceedings of the Fifth Social Media Mining for Health Applications Workshop & Shared Task
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Graciela Gonzalez-Hernandez, Ari Z. Klein, Ivan Flores, Davy Weissenbacher, Arjun Magge, Karen O'Connor, Abeed Sarker, Anne-Lyse Minard, Elena Tutubalina, Zulfat Miftahutdinov, Ilseyar Alimova
Venue:
SMM4H
Publisher:
Association for Computational Linguistics
Pages:
95–100
URL:
https://aclanthology.org/2020.smm4h-1.14
Cite (ACL):
Lucie Gattepaille. 2020. How Far Can We Go with Just Out-of-the-box BERT Models?. In Proceedings of the Fifth Social Media Mining for Health Applications Workshop & Shared Task, pages 95–100, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal):
How Far Can We Go with Just Out-of-the-box BERT Models? (Gattepaille, SMM4H 2020)
PDF:
https://aclanthology.org/2020.smm4h-1.14.pdf
Data
SMM4H