%0 Conference Proceedings %T A Unified Encoding of Structures in Transition Systems %A Ji, Tao %A Jiang, Yong %A Wang, Tao %A Huang, Zhongqiang %A Huang, Fei %A Wu, Yuanbin %A Wang, Xiaoling %Y Moens, Marie-Francine %Y Huang, Xuanjing %Y Specia, Lucia %Y Yih, Scott Wen-tau %S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing %D 2021 %8 November %I Association for Computational Linguistics %C Online and Punta Cana, Dominican Republic %F ji-etal-2021-unified %X Transition systems usually contain various dynamic structures (e.g., stacks, buffers). An ideal transition-based model should encode these structures completely and efficiently. Previous works relying on templates or neural network structures either encode only partial structure information or suffer from low computational efficiency. In this paper, we propose a novel attention-based encoder that unifies the representation of all structures in a transition system. Specifically, we separate two views of items on structures, namely a structure-invariant view and a structure-dependent view. With the help of a parallel-friendly attention network, we are able to encode transition states with O(1) additional complexity (with respect to basic feature extractors). Experiments on the PTB and UD show that our proposed method significantly improves test speed, achieves the best results among transition-based models, and is comparable to state-of-the-art methods. %R 10.18653/v1/2021.emnlp-main.339 %U https://aclanthology.org/2021.emnlp-main.339 %U https://doi.org/10.18653/v1/2021.emnlp-main.339 %P 4121-4133