Simple Attention-Based Representation Learning for Ranking Short Social Media Posts

Peng Shi, Jinfeng Rao, Jimmy Lin


Abstract
This paper explores the problem of ranking short social media posts with respect to user queries using neural networks. Instead of starting with a complex architecture, we proceed from the bottom up and examine the effectiveness of a simple, word-level Siamese architecture augmented with attention-based mechanisms for capturing semantic “soft” matches between query and post tokens. Extensive experiments on datasets from the TREC Microblog Tracks show that our simple models not only achieve better effectiveness than existing approaches that are far more complex or exploit a more diverse set of relevance signals, but are also much faster.
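The abstract describes attention-based "soft" matching between query and post tokens. As an illustration only (the paper's exact architecture is not reproduced here), the following hypothetical sketch shows one common way such token-level soft matching can be scored: build a query-post similarity matrix, attend over post tokens for each query token, and average the cosine similarity between each query token and its soft-aligned post representation. All function names and shapes are assumptions for this sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_match_score(query_emb, post_emb):
    """Illustrative attention-based soft match between token embeddings.

    query_emb: (m, d) array of query token embeddings
    post_emb:  (n, d) array of post token embeddings
    Hypothetical sketch, not the paper's exact model.
    """
    # Token-level similarity matrix, shape (m, n)
    sim = query_emb @ post_emb.T
    # Each query token attends over all post tokens
    attn = softmax(sim, axis=1)          # (m, n)
    attended = attn @ post_emb           # (m, d) soft-aligned post vectors
    # Score: mean cosine similarity between query tokens and their soft matches
    qn = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    an = attended / np.linalg.norm(attended, axis=1, keepdims=True)
    return float((qn * an).sum(axis=1).mean())
```

In a Siamese setup, the same embedding layer would encode both query and post before this matching step, and the resulting score (or pooled matching features) would feed a ranking loss.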
Anthology ID:
N19-1229
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2212–2217
URL:
https://aclanthology.org/N19-1229
DOI:
10.18653/v1/N19-1229
Cite (ACL):
Peng Shi, Jinfeng Rao, and Jimmy Lin. 2019. Simple Attention-Based Representation Learning for Ranking Short Social Media Posts. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2212–2217, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Simple Attention-Based Representation Learning for Ranking Short Social Media Posts (Shi et al., NAACL 2019)
PDF:
https://aclanthology.org/N19-1229.pdf