A Unified Encoding of Structures in Transition Systems

Tao Ji, Yong Jiang, Tao Wang, Zhongqiang Huang, Fei Huang, Yuanbin Wu, Xiaoling Wang


Abstract
Transition systems usually contain various dynamic structures (e.g., stacks, buffers). An ideal transition-based model should encode these structures completely and efficiently. Previous works relying on templates or neural network structures either encode only partial structure information or suffer from low computational efficiency. In this paper, we propose a novel attention-based encoder that unifies the representation of all structures in a transition system. Specifically, we separate two views of items on structures, namely a structure-invariant view and a structure-dependent view. With the help of a parallel-friendly attention network, we are able to encode transition states with O(1) additional complexity (with respect to basic feature extractors). Experiments on the PTB and UD show that our proposed method significantly improves test speed, achieves the best results among transition-based models, and is comparable to state-of-the-art methods.
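The abstract's two-view idea can be illustrated with a minimal sketch. This is not the authors' implementation; all names (`struct_emb`, `encode_state`) and the pooling scheme are assumptions for illustration only. The structure-invariant view is a per-token vector computed once by the base encoder; the structure-dependent view adds an embedding of the structure (stack or buffer) the token currently occupies, and a single attention step pools the items into a state representation:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size

# Structure-invariant view: one vector per token, computed once by the
# base encoder and reused unchanged across all transition states.
tokens = rng.standard_normal((5, d))

# Hypothetical structure embeddings, one per structure type.
struct_emb = {"stack": rng.standard_normal(d),
              "buffer": rng.standard_normal(d)}

def encode_state(stack, buffer):
    """Structure-dependent view: token vector plus the embedding of the
    structure it sits on, pooled by one attention step (O(1) extra work
    per state beyond the base feature extractor)."""
    items = [(i, "stack") for i in stack] + [(i, "buffer") for i in buffer]
    views = np.stack([tokens[i] + struct_emb[s] for i, s in items])
    query = views.mean(axis=0)            # stand-in for a learned query
    scores = views @ query / np.sqrt(d)   # scaled dot-product attention
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ views                # pooled state representation

state = encode_state(stack=[0, 1], buffer=[2, 3, 4])
print(state.shape)  # (8,)
```

Because the token vectors never change as the parser moves items between structures, only the cheap structure-embedding addition and attention pooling are recomputed per state, which is what makes the encoding parallel-friendly.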
Anthology ID:
2021.emnlp-main.339
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4121–4133
URL:
https://aclanthology.org/2021.emnlp-main.339
DOI:
10.18653/v1/2021.emnlp-main.339
Cite (ACL):
Tao Ji, Yong Jiang, Tao Wang, Zhongqiang Huang, Fei Huang, Yuanbin Wu, and Xiaoling Wang. 2021. A Unified Encoding of Structures in Transition Systems. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4121–4133, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
A Unified Encoding of Structures in Transition Systems (Ji et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.339.pdf
Software:
2021.emnlp-main.339.Software.zip