Exploring Semantic Capacity of Terms

Jie Huang, Zilong Wang, Kevin Chang, Wen-mei Hwu, JinJun Xiong


Abstract
We introduce and study the semantic capacity of terms. For example, the semantic capacity of artificial intelligence is higher than that of linear regression, since artificial intelligence possesses a broader meaning scope. Understanding the semantic capacity of terms will help many downstream tasks in natural language processing. For this purpose, we propose a two-step model to investigate the semantic capacity of terms, which takes a large text corpus as input and can evaluate the semantic capacity of terms provided that the corpus supplies enough co-occurrence information about them. Extensive experiments in three fields demonstrate the effectiveness and rationality of our model compared with well-designed baselines and human-level evaluations.
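The abstract notes that the model relies on term co-occurrence statistics drawn from a large corpus. As a minimal illustration of the kind of signal involved (not the paper's actual two-step model), the sketch below counts document-level co-occurrences for a small set of terms; the function name, toy documents, and term set are all hypothetical:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(documents, terms):
    """Count how often each pair of terms of interest appears in the
    same document.

    documents: list of token lists
    terms: set of terms of interest
    Returns a Counter mapping (term1, term2) pairs (alphabetical
    order) to co-occurrence counts.
    """
    counts = Counter()
    for doc in documents:
        # Terms of interest present in this document, in a canonical order
        present = sorted(set(doc) & set(terms))
        for t1, t2 in combinations(present, 2):
            counts[(t1, t2)] += 1
    return counts

# Toy corpus: each document is a list of (multi-word) terms
docs = [
    ["artificial_intelligence", "machine_learning", "linear_regression"],
    ["artificial_intelligence", "machine_learning"],
    ["linear_regression", "statistics"],
]
terms = {"artificial_intelligence", "machine_learning", "linear_regression"}
counts = cooccurrence_counts(docs, terms)
# counts[("artificial_intelligence", "machine_learning")] == 2
```

In practice, a broad term such as "artificial intelligence" tends to co-occur with many more distinct terms than a narrow one such as "linear regression", which is the intuition such statistics can capture.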
Anthology ID:
2020.emnlp-main.684
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8509–8518
URL:
https://aclanthology.org/2020.emnlp-main.684
DOI:
10.18653/v1/2020.emnlp-main.684
Cite (ACL):
Jie Huang, Zilong Wang, Kevin Chang, Wen-mei Hwu, and JinJun Xiong. 2020. Exploring Semantic Capacity of Terms. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8509–8518, Online. Association for Computational Linguistics.
Cite (Informal):
Exploring Semantic Capacity of Terms (Huang et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.684.pdf
Video:
https://slideslive.com/38938735
Code:
c3sr/semantic-capacity