Essentia: Mining Domain-specific Paraphrases with Word-Alignment Graphs

Danni Ma, Chen Chen, Behzad Golshan, Wang-Chiew Tan


Abstract
Paraphrases are important linguistic resources for a wide variety of NLP applications. Many techniques for automatic paraphrase mining from general corpora have been proposed. While these techniques are successful at discovering generic paraphrases, they often fail to identify domain-specific paraphrases (e.g., staff and concierge in the hospitality domain). This is because current techniques are largely statistical, and domain-specific corpora are too small for statistical methods to work well. In this paper, we present an unsupervised graph-based technique to mine paraphrases from a small set of sentences that roughly share the same topic or intent. Our system, Essentia, relies on word-alignment techniques to create a word-alignment graph that merges and organizes tokens from input sentences. The resulting graph is then used to generate candidate paraphrases. We demonstrate that our system obtains high-quality paraphrases, as evaluated by crowd workers. We further show that the majority of the identified paraphrases are domain-specific and thus complement existing paraphrase databases.
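The core idea in the abstract can be illustrated with a minimal sketch: tokens shared across aligned sentences are merged into graph nodes, and divergent token spans between the same pair of shared "anchors" become paraphrase candidates. The function name and the naive exact-match alignment below are illustrative assumptions; Essentia itself builds the graph with a proper word aligner rather than exact token matching.

```python
def paraphrase_candidates(sent_a, sent_b):
    """Hypothetical sketch of the word-alignment-graph idea: merge tokens
    shared by two sentences and return divergent spans between shared
    anchors as paraphrase candidates (exact-match alignment is a stand-in
    for a real word aligner)."""
    a, b = sent_a.lower().split(), sent_b.lower().split()
    shared = set(a) & set(b)  # naive alignment: identical tokens only

    def segments(tokens):
        # Split a sentence into runs of non-shared tokens, keyed by the
        # pair of shared anchor tokens that surround each run.
        segs, prev, buf = {}, "<s>", []
        for tok in tokens + ["</s>"]:
            if tok in shared or tok == "</s>":
                segs[(prev, tok)] = buf
                buf, prev = [], tok
            else:
                buf.append(tok)
        return segs

    segs_a, segs_b = segments(a), segments(b)
    # Differing non-empty spans between the same anchors are candidates.
    return [
        (" ".join(segs_a[k]), " ".join(segs_b[k]))
        for k in segs_a.keys() & segs_b.keys()
        if segs_a[k] and segs_b[k] and segs_a[k] != segs_b[k]
    ]

pairs = paraphrase_candidates(
    "please ask the staff for extra towels",
    "please ask the concierge for extra towels",
)
print(pairs)  # [('staff', 'concierge')]
```

On this toy pair, the two sentences agree everywhere except between the anchors "the" and "for", so the sketch recovers the domain-specific pair (staff, concierge) from the abstract's example.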
Anthology ID:
D19-5307
Volume:
Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13)
Month:
November
Year:
2019
Address:
Hong Kong
Venues:
EMNLP | TextGraphs | WS
Publisher:
Association for Computational Linguistics
Pages:
52–57
URL:
https://aclanthology.org/D19-5307
DOI:
10.18653/v1/D19-5307
Cite (ACL):
Danni Ma, Chen Chen, Behzad Golshan, and Wang-Chiew Tan. 2019. Essentia: Mining Domain-specific Paraphrases with Word-Alignment Graphs. In Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13), pages 52–57, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Essentia: Mining Domain-specific Paraphrases with Word-Alignment Graphs (Ma et al., EMNLP 2019)
PDF:
https://aclanthology.org/D19-5307.pdf
Data
SNIPS