Using a Penalty-based Loss Re-estimation Method to Improve Implicit Discourse Relation Classification

Xiao Li, Yu Hong, Huibin Ruan, Zhen Huang


Abstract
We tackle implicit discourse relation classification, the task of automatically determining the semantic relationship between two arguments. Attention-worthy words in the arguments are crucial clues for classifying discourse relations, and attention mechanisms have proven effective at highlighting such words during encoding. However, our survey shows that some inessential words are unintentionally misjudged as attention-worthy and are therefore assigned heavier attention weights than they should be. We propose a penalty-based loss re-estimation method to regulate the attention learning process, integrating penalty coefficients into the loss computation on the basis of the overstability of attention weight distributions. We conduct experiments on the Penn Discourse TreeBank (PDTB) corpus. The test results show that our loss re-estimation method yields substantial improvements across a variety of attention mechanisms and achieves highly competitive performance compared to state-of-the-art methods.
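The abstract's core idea, re-weighting the training loss with a penalty tied to how skewed the attention distribution is, can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact formulation: the entropy-based "peakedness" measure, the function names, and the `alpha` hyperparameter are all assumptions standing in for the paper's overstability-based penalty coefficients.

```python
import numpy as np

def attention_entropy(weights):
    # Shannon entropy of a normalized attention distribution.
    # Low entropy means the weights concentrate on a few tokens.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(-(w * np.log(w + 1e-12)).sum())

def penalized_loss(base_loss, attn_weights, alpha=0.1):
    # Hypothetical stand-in for the paper's penalty coefficient:
    # scale the base loss up when the attention distribution is
    # overly peaked, pushing the model to spread attention more
    # cautiously over the argument words.
    max_entropy = np.log(len(attn_weights))          # entropy of a uniform distribution
    peakedness = 1.0 - attention_entropy(attn_weights) / max_entropy
    return base_loss * (1.0 + alpha * peakedness)
```

Under this sketch, a uniform attention distribution leaves the loss essentially unchanged, while a distribution dominated by one token inflates it, discouraging the model from over-committing to possibly inessential words.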
Anthology ID:
2020.coling-main.132
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1513–1518
URL:
https://aclanthology.org/2020.coling-main.132
DOI:
10.18653/v1/2020.coling-main.132
Cite (ACL):
Xiao Li, Yu Hong, Huibin Ruan, and Zhen Huang. 2020. Using a Penalty-based Loss Re-estimation Method to Improve Implicit Discourse Relation Classification. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1513–1518, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Using a Penalty-based Loss Re-estimation Method to Improve Implicit Discourse Relation Classification (Li et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.132.pdf