Hierarchical Phrase-Based Sequence-to-Sequence Learning

Bailin Wang, Ivan Titov, Jacob Andreas, Yoon Kim


Abstract
This paper describes a neural transducer that maintains the flexibility of standard sequence-to-sequence (seq2seq) models while incorporating hierarchical phrases as a source of inductive bias during training and as explicit constraints during inference. Our approach trains two models: a discriminative parser based on a bracketing transduction grammar whose derivation tree hierarchically aligns source and target phrases, and a neural seq2seq model that learns to translate the aligned phrases one-by-one. We use the same seq2seq model to translate at all phrase scales, which results in two inference modes: one mode in which the parser is discarded and only the seq2seq component is used at the sequence level, and another in which the parser is combined with the seq2seq model. Decoding in the latter mode is done with the cube-pruned CKY algorithm, which is more involved but can make use of new translation rules during inference. We formalize our model as a source-conditioned synchronous grammar and develop an efficient variational inference algorithm for training. When applied on top of both randomly initialized and pretrained seq2seq models, we find that it performs well compared to baselines on small-scale machine translation benchmarks.
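The chart-based decoding mode described above can be illustrated with a minimal sketch. This is not the paper's implementation (which uses cube pruning over neural phrase-translation scores); it is a toy CKY recognizer showing the underlying chart structure, where each source span is scored either as a single phrase or as the best combination of two adjacent sub-spans. The scoring functions here are hypothetical placeholders.

```python
# Illustrative CKY chart sketch (NOT the paper's cube-pruned decoder).
# In a simplified bracketing-transduction-grammar view, each source span
# [i, j) is translated either directly as one phrase, or by splitting it
# into two adjacent sub-spans and combining their translations.

def cky_best_scores(n, lexical_score, combine_score):
    """Fill a CKY chart over spans [i, j) of a length-n source sentence.

    lexical_score(i, j): score of translating span [i, j) as one phrase.
    combine_score(a, b): score of combining two sub-span scores a and b.
    Returns a dict mapping (i, j) -> best score for that span.
    """
    chart = {}
    for width in range(1, n + 1):          # bottom-up over span widths
        for i in range(0, n - width + 1):
            j = i + width
            best = lexical_score(i, j)      # treat the whole span as a phrase
            for k in range(i + 1, j):       # or split at every point k
                cand = combine_score(chart[(i, k)], chart[(k, j)])
                if cand > best:
                    best = cand
            chart[(i, j)] = best
    return chart

# Toy usage: hypothetical uniform lexical scores, additive combination.
scores = cky_best_scores(
    n=4,
    lexical_score=lambda i, j: 1.0 / (j - i),
    combine_score=lambda a, b: a + b,
)
print(scores[(0, 4)])  # best score for the full sentence -> 4.0
```

Cube pruning, as used in the paper, restricts the candidates explored at each chart cell to a small beam, which keeps this O(n^3) chart traversal tractable when combination scores come from a neural model.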
Anthology ID:
2022.emnlp-main.563
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8211–8229
URL:
https://aclanthology.org/2022.emnlp-main.563
DOI:
10.18653/v1/2022.emnlp-main.563
Bibkey:
Cite (ACL):
Bailin Wang, Ivan Titov, Jacob Andreas, and Yoon Kim. 2022. Hierarchical Phrase-Based Sequence-to-Sequence Learning. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8211–8229, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Phrase-Based Sequence-to-Sequence Learning (Wang et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.563.pdf