Reward-based Input Construction for Cross-document Relation Extraction

Byeonghu Na, Suhyeon Jo, Yeongmin Kim, Il-chul Moon


Abstract
Relation extraction (RE) is a fundamental task in natural language processing, aiming to identify relations between target entities in text. While many RE methods are designed for a single sentence or document, cross-document RE has emerged to address relations across multiple long documents. Given the nature of long documents in cross-document RE, extracting document embeddings is challenging due to the length constraints of pre-trained language models. Therefore, we propose REward-based Input Construction (REIC), the first learning-based sentence selector for cross-document RE. REIC extracts sentences based on relational evidence, enabling the RE module to effectively infer relations. Since supervision of evidence sentences is generally unavailable, we train REIC using reinforcement learning with RE prediction scores as rewards. Experimental results demonstrate the superiority of our method over heuristic methods for different RE structures and backbones in cross-document RE. Our code is publicly available at https://github.com/aailabkaist/REIC.
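The abstract describes training the sentence selector with reinforcement learning, using the RE module's prediction score as the reward. The sketch below illustrates one such REINFORCE-style policy-gradient update for a Bernoulli sentence selector; it is a minimal illustration, not the paper's released code (see the GitHub link above). The `SentenceSelector` class, the toy `reward_fn`, and all tensor shapes are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

# Minimal sketch of a reward-based sentence selector trained with REINFORCE.
# Assumptions (not from the paper's code): `sentence_embs` stands in for
# precomputed sentence embeddings, and `reward_fn` for any RE module whose
# prediction score on the gold relation serves as the reward.

class SentenceSelector(nn.Module):
    """Scores each candidate sentence; a Bernoulli sample decides selection."""
    def __init__(self, emb_dim: int):
        super().__init__()
        self.scorer = nn.Linear(emb_dim, 1)

    def forward(self, sentence_embs: torch.Tensor) -> torch.Tensor:
        # sentence_embs: (num_sentences, emb_dim) -> per-sentence keep prob.
        return torch.sigmoid(self.scorer(sentence_embs)).squeeze(-1)

def reinforce_step(selector, optimizer, sentence_embs, reward_fn):
    """One policy-gradient update: sample a sentence subset, score it with
    the RE module (reward_fn), and reinforce the sampled actions."""
    probs = selector(sentence_embs)                  # (num_sentences,)
    dist = torch.distributions.Bernoulli(probs=probs)
    actions = dist.sample()                          # 1 = keep this sentence
    reward = reward_fn(actions)                      # RE prediction score
    # REINFORCE: maximize E[reward] <=> minimize -reward * log pi(actions)
    loss = -(reward * dist.log_prob(actions).sum())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward

if __name__ == "__main__":
    torch.manual_seed(0)
    emb_dim, num_sentences = 16, 10
    selector = SentenceSelector(emb_dim)
    optimizer = torch.optim.Adam(selector.parameters(), lr=1e-2)
    sentence_embs = torch.randn(num_sentences, emb_dim)

    # Toy reward: pretend the first three sentences are the evidence and
    # reward selections that keep them while staying short overall.
    evidence = torch.zeros(num_sentences)
    evidence[:3] = 1.0
    def reward_fn(actions):
        return (actions * evidence).sum() - 0.1 * actions.sum()

    for _ in range(200):
        reinforce_step(selector, optimizer, sentence_embs, reward_fn)
    print("final selection probs:", selector(sentence_embs).detach())
```

In the paper's setting, `reward_fn` would correspond to running the downstream RE module on the selected sentences and taking its prediction score for the target relation, which is why no evidence-sentence supervision is needed.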
Anthology ID:
2024.acl-long.501
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9254–9270
URL:
https://aclanthology.org/2024.acl-long.501
Cite (ACL):
Byeonghu Na, Suhyeon Jo, Yeongmin Kim, and Il-chul Moon. 2024. Reward-based Input Construction for Cross-document Relation Extraction. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9254–9270, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Reward-based Input Construction for Cross-document Relation Extraction (Na et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.501.pdf