SLATE: A Sequence Labeling Approach for Task Extraction from Free-form Inked Content

Apurva Gandhi, Ryan Serrao, Biyi Fang, Gilbert Antonius, Jenna Hong, Tra My Nguyen, Sheng Yi, Ehi Nosakhare, Irene Shaffer, Soundararajan Srinivasan


Abstract
We present SLATE, a sequence labeling approach for extracting tasks from free-form content such as digitally handwritten (or “inked”) notes on a virtual whiteboard. Our approach allows us to create a single, low-latency model that simultaneously performs sentence segmentation and classification of these sentences into task/non-task sentences. SLATE greatly outperforms a baseline two-model approach (a sentence segmentation model followed by a classification model), achieving a task F1 score of 84.4%, a sentence segmentation (boundary similarity) score of 88.4%, and three times lower latency than the baseline. Furthermore, we provide insights into tackling the challenges of performing NLP in the inking domain. We release both our code and dataset for this novel task.
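The abstract describes a single sequence-labeling model that jointly segments sentences and classifies them as task or non-task. A minimal sketch of how such per-token labels can be decoded into classified sentences is shown below; the label scheme (B-TASK / I-TASK / B-OTHER / I-OTHER, where a "B-" tag opens a new sentence and the suffix gives its class) is an illustrative assumption, not necessarily the paper's actual tagging scheme.

```python
# Hedged sketch: decoding joint segmentation + task-classification labels.
# Assumed scheme: "B-" starts a new sentence; suffix TASK/OTHER is its class.

def decode(tokens, labels):
    """Group tokens into (sentence, is_task) pairs from per-token labels."""
    sentences = []
    current, current_is_task = [], False
    for tok, lab in zip(tokens, labels):
        prefix, _, cls = lab.partition("-")
        if prefix == "B" and current:  # sentence boundary: flush current
            sentences.append((" ".join(current), current_is_task))
            current = []
        current.append(tok)
        current_is_task = (cls == "TASK")
    if current:  # flush the final sentence
        sentences.append((" ".join(current), current_is_task))
    return sentences

tokens = ["email", "Bob", "today", "great", "meeting"]
labels = ["B-TASK", "I-TASK", "I-TASK", "B-OTHER", "I-OTHER"]
print(decode(tokens, labels))
# → [('email Bob today', True), ('great meeting', False)]
```

Because segmentation and classification are read off the same label sequence, a single forward pass yields both outputs, which is the source of the latency advantage over the two-model baseline.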
Anthology ID: 2022.emnlp-industry.21
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month: December
Year: 2022
Address: Abu Dhabi, UAE
Editors: Yunyao Li, Angeliki Lazaridou
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 206–217
URL: https://aclanthology.org/2022.emnlp-industry.21
DOI: 10.18653/v1/2022.emnlp-industry.21
Cite (ACL):
Apurva Gandhi, Ryan Serrao, Biyi Fang, Gilbert Antonius, Jenna Hong, Tra My Nguyen, Sheng Yi, Ehi Nosakhare, Irene Shaffer, and Soundararajan Srinivasan. 2022. SLATE: A Sequence Labeling Approach for Task Extraction from Free-form Inked Content. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 206–217, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
SLATE: A Sequence Labeling Approach for Task Extraction from Free-form Inked Content (Gandhi et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-industry.21.pdf