DeepPavlov 1.0: Your Gateway to Advanced NLP Models Backed by Transformers and Transfer Learning
Maksim Savkin | Anastasia Voznyuk | Fedor Ignatov | Anna Korzanova | Dmitry Karpov | Alexander Popov | Vasily Konovalov
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
We present DeepPavlov 1.0, an open-source framework for applying Natural Language Processing (NLP) models that leverages transfer learning techniques. DeepPavlov 1.0 enables modular, configuration-driven development of state-of-the-art NLP models, supports a wide range of NLP applications, and is designed for practitioners with limited knowledge of NLP/ML. The framework is based on PyTorch and supports HuggingFace Transformers. DeepPavlov is publicly released under the Apache 2.0 license and provides access to an online demo.
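
To illustrate the configuration-driven workflow mentioned in the abstract, here is a minimal sketch of loading a pretrained pipeline through DeepPavlov's `build_model` API. The specific config name `ner_ontonotes_bert` and the exact output format are assumptions and may vary between framework versions.

```python
# Minimal sketch of DeepPavlov's configuration-driven API.
# The config name "ner_ontonotes_bert" is an assumed example and may
# differ depending on the installed DeepPavlov version.
from deeppavlov import build_model

# build_model assembles the pipeline described by a config;
# download=True fetches pretrained weights on first use.
ner = build_model("ner_ontonotes_bert", download=True)

# The built pipeline is callable on raw text; NER configs are assumed
# to return a batch of token lists and a batch of tag lists.
tokens, tags = ner(["DeepPavlov is an open-source NLP framework."])
print(list(zip(tokens[0], tags[0])))
```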