Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations

Vladimir Araujo, Andrés Villa, Marcelo Mendoza, Marie-Francine Moens, Alvaro Soto


Abstract
Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.
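The abstract's central idea, predicting upcoming sentences through top-down connections attached to intermediate encoder layers, can be made concrete with a short sketch. The code below is an illustrative approximation, not the paper's released implementation: the class name `PredictiveCodingBert`, the choice of layers (4, 8, 12), the use of the [CLS] state, and the MSE objective are all assumptions made for this example.

```python
# Illustrative sketch (assumed, not the authors' exact architecture):
# a BERT encoder augmented with per-layer "top-down" predictors that
# try to predict the next sentence's representation.
import torch
import torch.nn as nn
from transformers import BertModel

class PredictiveCodingBert(nn.Module):
    def __init__(self, model_name="bert-base-uncased", pc_layers=(4, 8, 12)):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # One top-down predictor per selected intermediate layer
        # (indices into the encoder's stack of hidden states).
        self.predictors = nn.ModuleDict(
            {str(layer): nn.Linear(hidden, hidden) for layer in pc_layers}
        )
        self.pc_layers = pc_layers

    def forward(self, curr_inputs, next_inputs):
        # Encode the current sentence, keeping all intermediate states.
        curr = self.encoder(**curr_inputs, output_hidden_states=True)
        # Target: the final [CLS] state of the *next* sentence.
        # Detaching it (a common stop-gradient choice, assumed here)
        # keeps the target fixed while the predictors learn.
        with torch.no_grad():
            target = self.encoder(**next_inputs).last_hidden_state[:, 0]
        pc_loss = 0.0
        for layer in self.pc_layers:
            # The [CLS] state at an intermediate layer predicts the
            # representation of the sentence that follows.
            pred = self.predictors[str(layer)](curr.hidden_states[layer][:, 0])
            pc_loss = pc_loss + nn.functional.mse_loss(pred, target)
        return pc_loss / len(self.pc_layers)
```

In a training loop, a loss like this would plausibly be added as an auxiliary term to the usual masked-language-modeling objective; after training, the predictors can be dropped, leaving a standard BERT-style encoder whose sentence representations have been shaped by the discourse-level prediction task.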
Anthology ID: 2021.emnlp-main.240
Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2021
Address: Online and Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3016–3022
URL: https://aclanthology.org/2021.emnlp-main.240
DOI: 10.18653/v1/2021.emnlp-main.240
Cite (ACL): Vladimir Araujo, Andrés Villa, Marcelo Mendoza, Marie-Francine Moens, and Alvaro Soto. 2021. Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3016–3022, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations (Araujo et al., EMNLP 2021)
PDF: https://aclanthology.org/2021.emnlp-main.240.pdf
Video: https://aclanthology.org/2021.emnlp-main.240.mp4