Measuring Consistency in Text-based Financial Forecasting Models

Linyi Yang, Yingpeng Ma, Yue Zhang


Abstract
Financial forecasting has been an important and active area of machine learning research, as even the most modest advantages in predictive accuracy can be parlayed into significant financial gains. Recent advances in natural language processing (NLP) bring the opportunity to leverage textual data, such as earnings reports of publicly traded companies, to predict the return rate for an asset. However, when dealing with such a sensitive task, the consistency of models – their invariance under meaning-preserving alternations in input – is a crucial property for building user trust. Despite this, current methods for financial forecasting do not take consistency into consideration. To address this issue, we propose FinTrust, an evaluation tool that assesses logical consistency in financial text. Using FinTrust, we show that the consistency of state-of-the-art NLP models for financial forecasting is poor. Our analysis of the performance degradation caused by meaning-preserving alternations suggests that current text-based methods are not suitable for robustly predicting market information.
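As an illustration of the consistency notion described in the abstract, the following minimal Python sketch (not the authors' FinTrust tool; the predict function, inputs, and label set are hypothetical assumptions) measures how often a forecasting model's prediction stays unchanged under a meaning-preserving alternation of its input text.

from typing import Callable, List

def consistency_rate(
    predict: Callable[[str], int],   # hypothetical model: text -> return-direction label
    originals: List[str],            # original financial texts (e.g., earnings-report excerpts)
    variants: List[str],             # meaning-preserving alternations, aligned with originals
) -> float:
    """Fraction of examples whose prediction is unchanged under alternation."""
    assert len(originals) == len(variants)
    agreements = sum(
        predict(orig) == predict(var)
        for orig, var in zip(originals, variants)
    )
    return agreements / len(originals) if originals else 1.0

A higher rate indicates a model that is more invariant, and hence more consistent, under such alternations.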
Anthology ID: 2023.acl-long.769
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 13751–13765
URL: https://aclanthology.org/2023.acl-long.769
DOI: 10.18653/v1/2023.acl-long.769
Cite (ACL): Linyi Yang, Yingpeng Ma, and Yue Zhang. 2023. Measuring Consistency in Text-based Financial Forecasting Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13751–13765, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Measuring Consistency in Text-based Financial Forecasting Models (Yang et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.769.pdf
Video: https://aclanthology.org/2023.acl-long.769.mp4