HyperT5: Towards Compute-Efficient Korean Language Modeling

Dongju Park, Soonwon Ka, Kang Min Yoo, Gichang Lee, Jaewook Kang


Abstract
Pretraining and fine-tuning language models have become the standard practice in industrial natural language processing (NLP), but developing and deploying general-purpose language models without abundant computation or data resources is a real-world challenge for smaller organizations and communities whose main focus is languages with less accessible resources (e.g., non-English). This paper explores the sequence-to-sequence (seq2seq) language model architecture as a more practical and compute-efficient alternative to the decoder-oriented approach (e.g., GPT-3), accompanied by novel findings in compute-optimality analyses. We successfully trained billion-scale Korean-language seq2seq language models that strongly outperform other competitive models on Korean benchmarks. Moreover, we demonstrate that such language models can be utilized more efficiently by employing a heavy pre-finetuning strategy, showcased in a case study on dialog-task adaptation. The case study shows that adapting language models with more readily available domain-specific unlabeled data greatly improves fine-tuning data efficiency in low-resource settings.
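The heavy pre-finetuning recipe described in the abstract (adapting the pretrained seq2seq model on unlabeled, in-domain text before fine-tuning on a small labeled dialog set) can be sketched roughly as below. This is an illustrative assumption, not the authors' released code: the checkpoint (`google/mt5-small` standing in for the paper's Korean model), the toy Korean strings, and the simplified denoising objective (reconstructing a sentence from its truncated prefix rather than true span corruption) are all placeholders.

```python
# Rough sketch (not the authors' code) of the two-stage recipe from the abstract:
#   Stage 1: "pre-finetune" a seq2seq LM on unlabeled, domain-specific text.
#   Stage 2: fine-tune on a small labeled dialog dataset.
# Placeholders: google/mt5-small stands in for the paper's Korean model, the toy
# Korean strings stand in for real corpora, and the denoising objective is
# simplified to prefix-to-full-sentence reconstruction.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/mt5-small"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

def train_step(sources, targets):
    """One seq2seq gradient step on a batch of (input text, target text) pairs."""
    enc = tokenizer(sources, padding=True, truncation=True,
                    max_length=128, return_tensors="pt")
    labels = tokenizer(targets, padding=True, truncation=True,
                       max_length=128, return_tensors="pt").input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # mask padding in the loss
    loss = model(**enc, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# Stage 1: pre-finetuning on unlabeled in-domain text (toy examples).
domain_text = ["고객: 배송이 언제 시작되나요?", "상담사: 주문은 내일 발송될 예정입니다."]
for text in domain_text:
    train_step([text[: len(text) // 2]], [text])  # simplified denoising objective

# Stage 2: low-resource fine-tuning on a small labeled dialog set (toy example).
dialog_pairs = [("고객: 환불하고 싶어요.", "상담사: 주문 번호를 알려주시겠어요?")]
for src, tgt in dialog_pairs:
    train_step([src], [tgt])
```

One reading of why the paper favors the seq2seq setup is that the same encoder-decoder objective serves both the unlabeled denoising stage and the labeled dialog stage with no architectural changes, which keeps the adaptation pipeline simple and compute-efficient.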
Anthology ID: 2023.acl-industry.40
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Sunayana Sitaram, Beata Beigman Klebanov, Jason D Williams
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 412–424
URL: https://aclanthology.org/2023.acl-industry.40
DOI: 10.18653/v1/2023.acl-industry.40
Cite (ACL): Dongju Park, Soonwon Ka, Kang Min Yoo, Gichang Lee, and Jaewook Kang. 2023. HyperT5: Towards Compute-Efficient Korean Language Modeling. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track), pages 412–424, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): HyperT5: Towards Compute-Efficient Korean Language Modeling (Park et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-industry.40.pdf