Lin: Unsupervised Extraction of Tasks from Textual Communication

Parth Diwanji, Hui Guo, Munindar Singh, Anup Kalia


Abstract
Commitments and requests are a hallmark of collaborative communication, especially in team settings. Identifying the specific tasks being committed to or requested in emails and chat messages can enable important downstream applications, such as producing to-do lists, reminders, and calendar entries. State-of-the-art approaches to task identification rely on large annotated datasets, which are not always available, especially for domain-specific tasks. Accordingly, we propose Lin, an unsupervised approach for identifying tasks that leverages dependency parsing and VerbNet. Our evaluations show that Lin yields results comparable to or more accurate than supervised models on domains with large training sets, and maintains its performance on unseen domains.
Anthology ID: 2020.coling-main.164
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 1815–1819
URL: https://aclanthology.org/2020.coling-main.164
DOI: 10.18653/v1/2020.coling-main.164
Cite (ACL): Parth Diwanji, Hui Guo, Munindar Singh, and Anup Kalia. 2020. Lin: Unsupervised Extraction of Tasks from Textual Communication. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1815–1819, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Lin: Unsupervised Extraction of Tasks from Textual Communication (Diwanji et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.164.pdf
Code: parth27/lin
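
To make the abstract's ingredients concrete, the sketch below combines spaCy dependency parsing with NLTK's VerbNet interface to surface candidate verb-object "task" phrases from a message. It is only an illustration under assumptions of our own choosing (the en_core_web_sm model, NLTK's verbnet corpus, and a crude "verb appears in VerbNet" filter); it is not the Lin algorithm and is not taken from the parth27/lin repository.

# Illustration only: NOT the Lin algorithm and not from parth27/lin.
# Setup assumed: pip install spacy nltk
#                python -m spacy download en_core_web_sm
#                python -c "import nltk; nltk.download('verbnet')"
import spacy
from nltk.corpus import verbnet

nlp = spacy.load("en_core_web_sm")

def candidate_tasks(message):
    """Yield (verb lemma, object phrase) pairs that look like actionable tasks."""
    doc = nlp(message)
    for token in doc:
        if token.pos_ != "VERB":
            continue
        # Crude action filter: keep only verbs VerbNet has a class for.
        if not verbnet.classids(lemma=token.lemma_):
            continue
        for child in token.children:
            if child.dep_ in ("dobj", "obj"):  # direct object of the verb
                # The object's subtree approximates the task's argument span.
                phrase = " ".join(t.text for t in child.subtree)
                yield token.lemma_, phrase

if __name__ == "__main__":
    text = "Could you please send the quarterly report to Alice by Friday?"
    for verb, obj in candidate_tasks(text):
        print(f"candidate task: {verb} {obj}")
        # -> candidate task: send the quarterly report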