AutoRC: Improving BERT Based Relation Classification Models via Architecture Search

Wei Zhu


Abstract
Although BERT-based relation classification (RC) models have achieved significant improvements over traditional deep learning models, no consensus has been reached on the optimal architecture, since many design choices are available. In this work, we design a comprehensive search space for BERT-based RC models and employ a modified version of the efficient neural architecture search (ENAS) method to automatically discover the design choices mentioned above. Experiments on eight benchmark RC tasks show that our method is efficient and effective in finding better architectures than the baseline BERT-based RC models. An ablation study demonstrates the necessity of our search space design and the effectiveness of our search method. We also show that our framework can be applied to other entity-related tasks, such as coreference resolution and span-based named entity recognition (NER).
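The abstract describes sampling design choices from a search space with an ENAS-style controller trained by policy gradient. Below is a minimal, hypothetical sketch of that idea: the specific design decisions (entity-span pooling, whether to use the [CLS] vector, how features are combined) and the helper `evaluate_child` are illustrative assumptions for this sketch, not the paper's actual search space or training setup.

```python
import torch
import torch.nn as nn

# Illustrative categorical design decisions for the RC output head.
# These names are assumptions for the sketch, not the paper's search space.
SEARCH_SPACE = {
    "entity_pool": ["start_token", "mean", "max"],  # pooling over entity spans
    "use_cls": [True, False],                       # include the [CLS] vector?
    "combine": ["concat", "sum"],                   # how to merge the features
}

class Controller(nn.Module):
    """LSTM controller that samples one option per design decision."""
    def __init__(self, hidden=64):
        super().__init__()
        self.hidden = hidden
        self.lstm = nn.LSTMCell(hidden, hidden)
        self.start = nn.Parameter(torch.zeros(1, hidden))
        self.heads = nn.ModuleList(
            nn.Linear(hidden, len(opts)) for opts in SEARCH_SPACE.values()
        )

    def sample(self):
        h = torch.zeros(1, self.hidden)
        c = torch.zeros(1, self.hidden)
        x = self.start
        choices, log_probs = [], []
        for head in self.heads:
            h, c = self.lstm(x, (h, c))
            dist = torch.distributions.Categorical(logits=head(h))
            idx = dist.sample()
            choices.append(int(idx))
            log_probs.append(dist.log_prob(idx))
            x = h  # autoregressive: previous state conditions the next choice
        return choices, torch.stack(log_probs).sum()

def evaluate_child(choices):
    # Placeholder reward. In ENAS, the sampled child model would be built
    # (here: an RC head on top of shared BERT weights), trained briefly,
    # and its dev-set score returned as the reward.
    return float(torch.rand(()))

controller = Controller()
optimizer = torch.optim.Adam(controller.parameters(), lr=3e-4)
baseline = 0.0
for step in range(10):
    choices, log_prob = controller.sample()
    reward = evaluate_child(choices)
    baseline = 0.95 * baseline + 0.05 * reward  # moving-average baseline
    loss = -(reward - baseline) * log_prob      # REINFORCE gradient estimator
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```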
Anthology ID:
2021.acl-srw.4
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Student Research Workshop
Month:
August
Year:
2021
Address:
Online
Editors:
Jad Kabbara, Haitao Lin, Amandalynne Paullada, Jannis Vamvas
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
33–43
URL:
https://aclanthology.org/2021.acl-srw.4
DOI:
10.18653/v1/2021.acl-srw.4
Cite (ACL):
Wei Zhu. 2021. AutoRC: Improving BERT Based Relation Classification Models via Architecture Search. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Student Research Workshop, pages 33–43, Online. Association for Computational Linguistics.
Cite (Informal):
AutoRC: Improving BERT Based Relation Classification Models via Architecture Search (Zhu, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-srw.4.pdf
Video:
https://aclanthology.org/2021.acl-srw.4.mp4