Exploring the Impact of Corpus Diversity on Financial Pretrained Language Models

Jaeyoung Choe, Keonwoong Noh, Nayeon Kim, Seyun Ahn, Woohwan Jung


Abstract
Over the past few years, various domain-specific pretrained language models (PLMs) have been proposed and have outperformed general-domain PLMs in specialized areas such as the biomedical, scientific, and clinical domains. In addition, financial PLMs have been studied because of the high economic impact of financial data analysis. However, we found that financial PLMs were not pretrained on sufficiently diverse financial data. This lack of diverse training data leads to subpar generalization performance, with the result that general-purpose PLMs, including BERT, often outperform financial PLMs on many downstream tasks. To address this issue, we collected a broad range of financial corpora and trained the Financial Language Model (FiLM) on these diverse datasets. Our experimental results confirm that FiLM outperforms not only existing financial PLMs but also general-domain PLMs. Furthermore, we provide empirical evidence that this improvement can be achieved even on corpus groups unseen during pretraining.
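As a concrete illustration (not taken from the paper), the general-vs-financial comparison described in the abstract is typically run by fine-tuning each checkpoint on a financial downstream task, such as sentiment classification on Financial PhraseBank. The sketch below uses Hugging Face Transformers; the `bert-base-uncased` checkpoint, the dataset choice, and all hyperparameters are illustrative assumptions, and the authors' released FiLM weights would be substituted for the baseline checkpoint to reproduce the comparison.

```python
# Hedged sketch (not the authors' code): fine-tune a PLM on a financial
# sentiment task so general-domain and financial checkpoints can be compared.
# Assumes the `transformers` and `datasets` packages are installed.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder checkpoint: swap in a financial PLM (e.g., released FiLM
# weights) to run the same comparison against the BERT baseline.
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=3  # negative / neutral / positive
)

# Financial PhraseBank ships only a train split, so hold out a test set.
data = load_dataset("financial_phrasebank", "sentences_allagree")
data = data["train"].train_test_split(test_size=0.2, seed=42)

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, max_length=128)

data = data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="plm-finetune",          # illustrative output path
        num_train_epochs=3,
        per_device_train_batch_size=16,
    ),
    train_dataset=data["train"],
    eval_dataset=data["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
print(trainer.evaluate())
```

Under this setup, the abstract's claim would correspond to the FiLM checkpoint yielding a lower evaluation loss (or higher task accuracy, with a suitable `compute_metrics`) than both the BERT baseline and earlier financial PLMs such as FinBERT.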
Anthology ID:
2023.findings-emnlp.138
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2101–2112
URL:
https://aclanthology.org/2023.findings-emnlp.138
DOI:
10.18653/v1/2023.findings-emnlp.138
Cite (ACL):
Jaeyoung Choe, Keonwoong Noh, Nayeon Kim, Seyun Ahn, and Woohwan Jung. 2023. Exploring the Impact of Corpus Diversity on Financial Pretrained Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2101–2112, Singapore. Association for Computational Linguistics.
Cite (Informal):
Exploring the Impact of Corpus Diversity on Financial Pretrained Language Models (Choe et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.138.pdf