Model-Free Context-Aware Word Composition

Bo An, Xianpei Han, Le Sun


Abstract
Word composition is a promising technique for learning representations of larger linguistic units (e.g., phrases, sentences, and documents). However, most current composition models take into account neither the ambiguity of words nor the context outside a linguistic unit when learning representations, and consequently produce inaccurate semantic representations. To address this issue, we propose a model-free context-aware word composition model, which employs latent semantic information as global context for learning representations. The proposed model resolves word sense disambiguation and word composition in a unified framework. Extensive evaluation shows consistent improvements over various strong word representation/composition models at different granularities (word, phrase, and sentence), demonstrating the effectiveness of our proposed method.
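To make the contrast in the abstract concrete, the sketch below is a toy illustration (not the paper's actual model): a context-free additive composition simply averages word vectors, while a context-aware variant weights each word by its similarity to a global context vector, a rough analogue of using latent semantic information to disambiguate words before composing them. All vectors and names here are hypothetical.

```python
import math

def cosine(u, v):
    # Cosine similarity between two plain-list vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def average_compose(vectors):
    # Context-free additive composition: element-wise mean of the word vectors.
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def context_aware_compose(vectors, context):
    # Context-aware composition: weight each word vector by its (clipped)
    # cosine similarity to a global context vector, then average.
    weights = [max(cosine(v, context), 0.0) for v in vectors]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    return [sum(w * x for w, x in zip(weights, col)) for col in zip(*vectors)]

# Hypothetical 3-d embeddings for the ambiguous phrase "bank account":
bank = [0.9, 0.1, 0.4]      # mixes a "river" sense and a "finance" sense
account = [0.1, 0.9, 0.1]
finance_context = [0.0, 1.0, 0.2]  # global context pointing to finance

plain = average_compose([bank, account])
contextual = context_aware_compose([bank, account], finance_context)
```

In this toy setup the financial context pulls the composed representation toward the finance-oriented word `account`, whereas the plain average treats both words equally regardless of context.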
Anthology ID:
C18-1240
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2834–2845
URL:
https://aclanthology.org/C18-1240
Cite (ACL):
Bo An, Xianpei Han, and Le Sun. 2018. Model-Free Context-Aware Word Composition. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2834–2845, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Model-Free Context-Aware Word Composition (An et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1240.pdf