Buy Tesla, Sell Ford: Assessing Implicit Stock Market Preference in Pre-trained Language Models

Chengyu Chuang, Yi Yang


Abstract
Pretrained language models such as BERT have achieved remarkable success in several NLP tasks. With the wide adoption of BERT in real-world applications, researchers have begun to investigate the implicit biases encoded in it. In this paper, we assess the implicit stock market preferences in BERT and its finance domain-specific variant, FinBERT. We find some interesting patterns. For example, the language models are overall more positive towards the stock market, but there are significant differences in preferences between pairs of industry sectors, and even within a sector. Given the prevalence of NLP models in financial decision-making systems, this work raises awareness of their potential implicit preferences in the stock markets. Awareness of such problems can help practitioners improve the robustness and accountability of their financial NLP pipelines.
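To make the probing idea concrete, here is a minimal sketch of masked-token probing with the Hugging Face transformers library. The prompt template, the buy/sell target words, and the bert-base-uncased checkpoint are illustrative assumptions, not the authors' exact setup; a FinBERT checkpoint can be probed the same way by swapping the model name.

from transformers import pipeline

# Load a generic masked language model (assumption: the paper's exact
# checkpoints and prompts may differ from this sketch).
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical prompt template for eliciting a buy/sell preference.
PROMPT = "Investors should [MASK] {company} stock."

def buy_sell_scores(company: str) -> dict:
    # Restrict the mask predictions to the two target words and return
    # the model's probability for each.
    results = fill_mask(PROMPT.format(company=company), targets=["buy", "sell"])
    return {r["token_str"].strip(): r["score"] for r in results}

for company in ["Tesla", "Ford"]:
    print(company, buy_sell_scores(company))

Comparing relative probabilities across companies or sectors, rather than reading raw scores, is what would reveal a preference: a model with no implicit stance should score "buy" and "sell" similarly for every company.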
Anthology ID:
2022.acl-short.12
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
100–105
URL:
https://aclanthology.org/2022.acl-short.12
DOI:
10.18653/v1/2022.acl-short.12
Cite (ACL):
Chengyu Chuang and Yi Yang. 2022. Buy Tesla, Sell Ford: Assessing Implicit Stock Market Preference in Pre-trained Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 100–105, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Buy Tesla, Sell Ford: Assessing Implicit Stock Market Preference in Pre-trained Language Models (Chuang & Yang, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.12.pdf