Competing Independent Modules for Knowledge Integration and Optimization

Parsa Bagherzadeh, Sabine Bergler


Abstract
This paper presents a neural framework of untied, independent modules, used here to integrate off-the-shelf knowledge sources such as language models, lexica, POS information, and dependency relations. Each knowledge source is implemented as an independent component that can interact and share information with the other knowledge sources. We report proof-of-concept experiments for several standard sentiment analysis tasks and show that the knowledge sources interoperate effectively without interference. As a second use case, we show that the proposed framework is suitable for optimizing BERT-like language models even without the help of external knowledge sources: we cast each Transformer layer as a separate module and demonstrate performance improvements from this explicit integration of the different information encoded at different Transformer layers.
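The paper defines its own interaction mechanism between modules; the sketch below is only a minimal illustration of the general idea in PyTorch, not the authors' implementation. The class names, dimensions, and the softmax-gated combination (one way modules can "compete" for influence on the final representation) are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class IndependentModule(nn.Module):
    """One knowledge source (e.g., a lexicon or POS encoder), kept untied
    from the others: it has its own parameters and its own encoder.
    (Hypothetical class for illustration.)"""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.encoder(x)


class CompetingModules(nn.Module):
    """Combines the outputs of independent modules with a learned softmax
    gate, so modules compete for weight in the fused representation.
    (Assumed gating scheme, not the paper's exact mechanism.)"""

    def __init__(self, mods: list, hidden_dim: int):
        super().__init__()
        self.mods = nn.ModuleList(mods)
        self.gate = nn.Linear(hidden_dim, 1)

    def forward(self, inputs: list) -> torch.Tensor:
        # Each module encodes its own input: (batch, n_modules, hidden_dim)
        outs = torch.stack([m(x) for m, x in zip(self.mods, inputs)], dim=1)
        # Gate scores decide how much each module contributes
        weights = torch.softmax(self.gate(outs), dim=1)  # (batch, n_modules, 1)
        return (weights * outs).sum(dim=1)               # (batch, hidden_dim)


# Usage: fuse a 768-dim LM representation with a 50-dim lexicon feature vector
mods = [IndependentModule(768, 256), IndependentModule(50, 256)]
model = CompetingModules(mods, hidden_dim=256)
fused = model([torch.randn(4, 768), torch.randn(4, 50)])  # -> (4, 256)
```

The same pattern extends to the paper's second use case by treating each Transformer layer's output as one module input, so the gate learns which layers' information to emphasize.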
Anthology ID:
2021.findings-emnlp.376
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4416–4425
URL:
https://aclanthology.org/2021.findings-emnlp.376
DOI:
10.18653/v1/2021.findings-emnlp.376
Cite (ACL):
Parsa Bagherzadeh and Sabine Bergler. 2021. Competing Independent Modules for Knowledge Integration and Optimization. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4416–4425, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Competing Independent Modules for Knowledge Integration and Optimization (Bagherzadeh & Bergler, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.376.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.376.mp4
Data:
GLUE, SST, SST-2