Template-aware Attention Model for Earnings Call Report Generation

Yangchen Huang, Prashant K. Dhingra, Seyed Danial Mohseni Taheri


Abstract
Earnings calls are among the most important resources for investors and analysts when updating their price targets. Firms usually publish the corresponding transcripts soon after earnings events. However, raw transcripts are often too long and lack a coherent structure. To enhance clarity, analysts write well-structured reports for some important earnings call events by analyzing them, which requires time and effort. In this paper, we propose TATSum (Template-Aware aTtention model for Summarization), a generalized neural summarization approach for structured report generation, and evaluate its performance in the earnings call domain. We build a large corpus of thousands of transcripts and reports from historical earnings events. We first generate a candidate set of reports from the corpus to serve as potential soft templates, which do not impose hard constraints on the output. Then, we employ an encoder model with a margin-ranking loss to rank the candidate set and select the highest-quality template. Finally, the transcript and the selected soft template are used as input to a seq2seq framework for report generation. Empirical results on the earnings call dataset show that our model significantly outperforms state-of-the-art models in terms of informativeness and structure.
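To make the template-ranking step concrete, below is a minimal PyTorch sketch. It is not the authors' released code: it assumes a shared GRU encoder and a bilinear scorer (architectural details the abstract does not specify) and uses a margin-ranking loss so that, for a given transcript, a higher-quality candidate template outscores a lower-quality one. All names (TemplateRanker, better_tpl, worse_tpl) are hypothetical.

```python
import torch
import torch.nn as nn

class TemplateRanker(nn.Module):
    """Scores (transcript, candidate template) pairs with a shared encoder.

    Sketch only: the paper's encoder architecture is not given in the
    abstract, so a single-layer GRU and bilinear scorer stand in here.
    """
    def __init__(self, vocab_size: int, d_model: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.scorer = nn.Bilinear(d_model, d_model, 1)

    def encode(self, ids: torch.Tensor) -> torch.Tensor:
        # Use the final hidden state as a fixed-size sequence representation.
        _, h = self.encoder(self.embed(ids))  # h: (1, batch, d_model)
        return h.squeeze(0)

    def forward(self, transcript_ids, template_ids) -> torch.Tensor:
        return self.scorer(self.encode(transcript_ids),
                           self.encode(template_ids)).squeeze(-1)

# Training step: a candidate known to be a better template for this
# transcript (e.g. higher ROUGE against the gold report) should outscore
# a worse candidate by at least `margin`.
ranker = TemplateRanker(vocab_size=30_000)
loss_fn = nn.MarginRankingLoss(margin=0.1)
optimizer = torch.optim.Adam(ranker.parameters(), lr=1e-4)

transcript = torch.randint(0, 30_000, (8, 512))  # toy batch of token ids
better_tpl = torch.randint(0, 30_000, (8, 128))
worse_tpl = torch.randint(0, 30_000, (8, 128))

optimizer.zero_grad()
s_pos = ranker(transcript, better_tpl)
s_neg = ranker(transcript, worse_tpl)
loss = loss_fn(s_pos, s_neg, torch.ones_like(s_pos))  # target=1: s_pos > s_neg
loss.backward()
optimizer.step()
```

At inference time, the top-scoring candidate would be selected as the soft template and passed, together with the transcript, to the seq2seq generator described in the abstract.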
Anthology ID:
2021.newsum-1.2
Volume:
Proceedings of the Third Workshop on New Frontiers in Summarization
Month:
November
Year:
2021
Address:
Online and in Dominican Republic
Editors:
Giuseppe Carenini, Jackie Chi Kit Cheung, Yue Dong, Fei Liu, Lu Wang
Venue:
NewSum
Publisher:
Association for Computational Linguistics
Pages:
15–24
URL:
https://aclanthology.org/2021.newsum-1.2
DOI:
10.18653/v1/2021.newsum-1.2
Cite (ACL):
Yangchen Huang, Prashant K. Dhingra, and Seyed Danial Mohseni Taheri. 2021. Template-aware Attention Model for Earnings Call Report Generation. In Proceedings of the Third Workshop on New Frontiers in Summarization, pages 15–24, Online and in Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Template-aware Attention Model for Earnings Call Report Generation (Huang et al., NewSum 2021)
PDF:
https://aclanthology.org/2021.newsum-1.2.pdf
Video:
https://aclanthology.org/2021.newsum-1.2.mp4
Data:
CNN/Daily Mail