Neuralizing Regular Expressions for Slot Filling

Chengyue Jiang, Zijian Jin, Kewei Tu


Abstract
Neural models and symbolic rules such as regular expressions have their respective merits and weaknesses. In this paper, we study the integration of the two approaches for the slot filling task by converting regular expressions into neural networks. Specifically, we first convert regular expressions into a special form of finite-state transducers, then unfold their approximate inference algorithm as a bidirectional recurrent neural model that performs slot filling via sequence labeling. Experimental results show that our model has superior zero-shot and few-shot performance and stays competitive when sufficient training data are available.
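The key observation behind unfolding a finite-state machine into a recurrent model is that running the machine over a sentence is a linear recurrence: one transition matrix per input token, multiplied into a state vector step by step. The toy sketch below (not the paper's actual transducer construction; the states, vocabulary, and pattern are hypothetical) illustrates this recurrence for a word-level pattern resembling "fly to <CITY>".

```python
import numpy as np

# Toy illustration (assumed example, not the paper's model): a finite-state
# automaton for the pattern  q0 --fly--> q1 --to--> q2 (accepting),
# with transitions encoded as 0/1 matrices. Matching then becomes the
# recurrence h_t = h_{t-1} @ T[token_t], the form that can be unfolded
# into a recurrent neural network.

STATES = 3
VOCAB = {"fly": 0, "to": 1, "boston": 2}

# T[w][i, j] = 1 iff reading word w moves the automaton from state i to j.
T = np.zeros((len(VOCAB), STATES, STATES))
T[VOCAB["fly"]][0, 1] = 1.0      # q0 -fly-> q1
T[VOCAB["to"]][1, 2] = 1.0       # q1 -to->  q2
T[VOCAB["boston"]][2, 2] = 1.0   # q2 loops on city words (toy wildcard)

def run(tokens):
    """Forward pass: start in q0, multiply by one matrix per token."""
    h = np.zeros(STATES)
    h[0] = 1.0
    for tok in tokens:
        h = h @ T[VOCAB[tok]]
    return h

# The sentence matches iff the accepting state q2 has a nonzero score.
print(run(["fly", "to", "boston"]))  # -> [0. 0. 1.]
```

Relaxing the 0/1 entries of the transition matrices to learnable real values is what turns the hard symbolic matcher into a trainable neural recurrence.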
Anthology ID:
2021.emnlp-main.747
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9481–9498
URL:
https://aclanthology.org/2021.emnlp-main.747
DOI:
10.18653/v1/2021.emnlp-main.747
Cite (ACL):
Chengyue Jiang, Zijian Jin, and Kewei Tu. 2021. Neuralizing Regular Expressions for Slot Filling. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9481–9498, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Neuralizing Regular Expressions for Slot Filling (Jiang et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.747.pdf
Video:
https://aclanthology.org/2021.emnlp-main.747.mp4
Data
ATIS