2023
BERTabaporu: Assessing a Genre-Specific Language Model for Portuguese NLP
Pablo Botton Costa | Matheus Camasmie Pavan | Wesley Ramos Santos | Samuel Caetano Silva | Ivandré Paraboni
Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Transformer-based language models such as Bidirectional Encoder Representations from Transformers (BERT) are now mainstream in the NLP field, but extensions to languages other than English, to new domains and/or to more specific text genres are still in demand. In this paper we introduce BERTabaporu, a BERT language model pre-trained on Twitter data in the Brazilian Portuguese language. The model is shown to outperform the best-known general-purpose model for this language in three Twitter-related NLP tasks, making it a potentially useful resource for Portuguese NLP in general.
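Since the abstract presents BERTabaporu as a reusable resource, a minimal sketch of how a BERT-style checkpoint like this one is typically loaded with the Hugging Face transformers library may be helpful. The model identifier below is an assumption for illustration; the actual checkpoint name is whatever the authors published.

```python
# Minimal sketch: loading a BERT-style checkpoint with Hugging Face
# transformers and encoding a Brazilian Portuguese tweet.
from transformers import AutoModel, AutoTokenizer

# Hypothetical identifier; replace with the published BERTabaporu checkpoint.
MODEL_ID = "pablocosta/bertabaporu-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Tokenize an example tweet and obtain contextual token embeddings.
inputs = tokenizer("bom dia, Brasil!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The pooled or per-token embeddings can then feed a task-specific classifier head for Twitter-related tasks such as those evaluated in the paper.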