Local-to-global learning for iterative training of production SLU models on new features

Yulia Grishina, Daniil Sorokin


Abstract
In production SLU systems, new training data becomes available over time, so ML models need to be updated on a regular basis. Specifically, releasing new features adds new classes of data while the old data remains constant. However, retraining the full model from scratch each time is computationally expensive. To address this problem, we propose to view production releases from the curriculum learning perspective and to adapt the local-to-global learning (LGL) schedule (Cheng et al., 2019) to a statistical model that starts with fewer output classes and adds more classes with each iteration. We report experiments on intent classification and slot filling in the context of a production voice assistant. First, we apply the original LGL schedule to our data and then adapt LGL to the production setting, where the full data is not available at the initial training iterations. We demonstrate that our method reduces model error rates by 7.3% and saves up to 25% of training time for individual iterations.
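The abstract describes the LGL schedule only at a high level. As an illustration, the minimal PyTorch sketch below shows one way such a class-incremental schedule could look for intent classification: a small classification head is trained on a few output classes first, then expanded and trained further when a new feature release adds classes, while the previously released data stays in the training set. All module names, dimensions, hyperparameters, and data below are hypothetical placeholders, not the authors' implementation.

```python
# Hypothetical sketch of a local-to-global (LGL) style schedule:
# start with few intent classes, then grow the output layer and
# continue training as new feature classes are released.
import torch
import torch.nn as nn


def expand_classifier(old_head: nn.Linear, num_new_classes: int) -> nn.Linear:
    """Grow the output layer, copying weights of the existing classes."""
    new_head = nn.Linear(old_head.in_features,
                         old_head.out_features + num_new_classes)
    with torch.no_grad():
        new_head.weight[: old_head.out_features] = old_head.weight
        new_head.bias[: old_head.out_features] = old_head.bias
    return new_head


def train_iteration(encoder, head, features, labels, epochs=3, lr=1e-3):
    """One iteration: continue training on old plus newly released data."""
    model = nn.Sequential(encoder, head)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(features), labels)
        loss.backward()
        opt.step()
    return head


# Toy setup: 16-dim utterance encodings, 3 initial intent classes.
torch.manual_seed(0)
encoder = nn.Linear(16, 32)   # stand-in for a shared SLU encoder
head = nn.Linear(32, 3)       # "local" model with few output classes
x_old = torch.randn(60, 16)
y_old = torch.randint(0, 3, (60,))
head = train_iteration(encoder, head, x_old, y_old)

# A new feature release adds 2 intent classes; the old data stays constant,
# so the next iteration trains on the union of old and new examples.
head = expand_classifier(head, num_new_classes=2)
x_new = torch.randn(40, 16)
y_new = torch.randint(3, 5, (40,))
head = train_iteration(encoder,
                       head,
                       torch.cat([x_old, x_new]),
                       torch.cat([y_old, y_new]))
```

The key design point mirrored here is that each release continues from the previous model rather than retraining from scratch, which is where the training-time savings reported in the paper come from.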
Anthology ID:
2022.naacl-industry.13
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track
Month:
July
Year:
2022
Address:
Hybrid: Seattle, Washington + Online
Editors:
Anastassia Loukina, Rashmi Gangadharaiah, Bonan Min
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
103–111
URL:
https://aclanthology.org/2022.naacl-industry.13
DOI:
10.18653/v1/2022.naacl-industry.13
Bibkey:
Cite (ACL):
Yulia Grishina and Daniil Sorokin. 2022. Local-to-global learning for iterative training of production SLU models on new features. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track, pages 103–111, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
Cite (Informal):
Local-to-global learning for iterative training of production SLU models on new features (Grishina & Sorokin, NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-industry.13.pdf
Video:
https://aclanthology.org/2022.naacl-industry.13.mp4