Pre-trained Transformer-based Classification and Span Detection Models for Social Media Health Applications

Yuting Guo, Yao Ge, Mohammed Ali Al-Garadi, Abeed Sarker


Abstract
This paper describes our approach to six classification tasks (Tasks 1a, 3a, 3b, 4, and 5) and one span detection task (Task 1b) from the Social Media Mining for Health (SMM4H) 2021 shared tasks. We developed two separate systems for classification and span detection, both based on pre-trained Transformer-based models. In addition, we applied oversampling and classifier ensembling in the classification tasks. Our submissions exceeded the median scores in all tasks except Task 1a. Furthermore, our model achieved first place in Task 4 and obtained an F1-score 7% higher than the median in Task 1b.
Anthology ID: 2021.smm4h-1.8
Volume: Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task
Month: June
Year: 2021
Address: Mexico City, Mexico
Editors: Arjun Magge, Ari Klein, Antonio Miranda-Escalada, Mohammed Ali Al-garadi, Ilseyar Alimova, Zulfat Miftahutdinov, Eulalia Farre-Maduell, Salvador Lima Lopez, Ivan Flores, Karen O'Connor, Davy Weissenbacher, Elena Tutubalina, Abeed Sarker, Juan M Banda, Martin Krallinger, Graciela Gonzalez-Hernandez
Venue: SMM4H
Publisher: Association for Computational Linguistics
Pages: 52–57
URL: https://aclanthology.org/2021.smm4h-1.8
DOI: 10.18653/v1/2021.smm4h-1.8
Cite (ACL): Yuting Guo, Yao Ge, Mohammed Ali Al-Garadi, and Abeed Sarker. 2021. Pre-trained Transformer-based Classification and Span Detection Models for Social Media Health Applications. In Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task, pages 52–57, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): Pre-trained Transformer-based Classification and Span Detection Models for Social Media Health Applications (Guo et al., SMM4H 2021)
PDF: https://aclanthology.org/2021.smm4h-1.8.pdf