Recovering Lexically and Semantically Reused Texts

Ansel MacLaughlin, Shaobin Xu, David A. Smith


Abstract
Writers often repurpose material from existing texts when composing new documents. Because most documents have more than one source, we cannot trace these connections using only models of document-level similarity. Instead, this paper considers methods for local text reuse detection (LTRD), detecting localized regions of lexically or semantically similar text embedded in otherwise unrelated material. In extensive experiments, we study the relative performance of four classes of neural and bag-of-words models on three LTRD tasks – detecting plagiarism, modeling journalists’ use of press releases, and identifying scientists’ citation of earlier papers. We conduct evaluations on three existing datasets and a new, publicly-available citation localization dataset. Our findings shed light on a number of previously-unexplored questions in the study of LTRD, including the importance of incorporating document-level context for predictions, the applicability of off-the-shelf neural models pretrained on “general” semantic textual similarity tasks such as paraphrase detection, and the trade-offs between more efficient bag-of-words and feature-based neural models and slower pairwise neural models.
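To make the bag-of-words side of the trade-off concrete, here is a minimal, hypothetical sketch of an efficient LTRD baseline: slide fixed-size word windows over a source and target document and flag window pairs whose word n-gram Jaccard similarity crosses a threshold. The function names, window sizes, and threshold are illustrative assumptions, not the paper's actual models or settings.

```python
def ngrams(tokens, n=3):
    """Set of word n-grams in a token list."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def find_reused_spans(src, tgt, window=10, stride=5, n=3, threshold=0.3):
    """Return (src_offset, tgt_offset, score) for locally similar windows.

    A toy bag-of-words detector: compares every src window against every
    tgt window, so unrelated surrounding material in either document does
    not dilute the score of a locally reused passage.
    """
    s_toks, t_toks = src.lower().split(), tgt.lower().split()
    hits = []
    for i in range(0, max(1, len(s_toks) - window + 1), stride):
        s_ng = ngrams(s_toks[i:i + window], n)
        for j in range(0, max(1, len(t_toks) - window + 1), stride):
            score = jaccard(s_ng, ngrams(t_toks[j:j + window], n))
            if score >= threshold:
                hits.append((i, j, score))
    return hits
```

A pairwise neural model would instead score each window pair with an encoder, which the abstract notes is slower but can catch semantic (paraphrased) reuse that exact n-gram overlap misses.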
Anthology ID:
2021.starsem-1.5
Volume:
Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics
Month:
August
Year:
2021
Address:
Online
Editors:
Lun-Wei Ku, Vivi Nastase, Ivan Vulić
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
52–66
URL:
https://aclanthology.org/2021.starsem-1.5
DOI:
10.18653/v1/2021.starsem-1.5
Cite (ACL):
Ansel MacLaughlin, Shaobin Xu, and David A. Smith. 2021. Recovering Lexically and Semantically Reused Texts. In Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics, pages 52–66, Online. Association for Computational Linguistics.
Cite (Informal):
Recovering Lexically and Semantically Reused Texts (MacLaughlin et al., *SEM 2021)
PDF:
https://aclanthology.org/2021.starsem-1.5.pdf
Code:
maclaughlin/arc-sim
Data:
S2ORC