Anton Bakalov
2018
A Fast, Compact, Accurate Model for Language Identification of Codemixed Text
Yuan Zhang | Jason Riesa | Daniel Gillick | Anton Bakalov | Jason Baldridge | David Weiss
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
We address fine-grained multilingual language identification: providing a language code for every token in a sentence, including codemixed text containing multiple languages. Such text is prevalent online, in documents, social media, and message boards. We show that a feed-forward network with a simple globally constrained decoder can accurately and rapidly label both codemixed and monolingual text in 100 languages and 100 language pairs. This model outperforms previously published multilingual approaches in terms of both accuracy and speed, yielding an 800x speed-up and a 19.5% averaged absolute gain on three codemixed datasets. It furthermore outperforms several benchmark systems on monolingual language identification.
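As a rough illustration of the approach the abstract describes, here is a minimal Python sketch (not the authors' code) of per-token scoring with a one-hidden-layer feed-forward net over hashed character n-grams, followed by a globally constrained decode that restricts every token in a sentence to the sentence's single best-scoring language pair. All dimensions, the feature hashing, and the random stand-in weights are assumptions for the example.

```python
import itertools
import numpy as np

N_LANGS = 100      # assumed label inventory size
N_FEATS = 1 << 16  # assumed hashed character n-gram feature space
HIDDEN = 64        # assumed hidden layer width

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (N_FEATS, HIDDEN))  # stand-in for trained weights
W2 = rng.normal(0, 0.1, (HIDDEN, N_LANGS))

def token_features(token: str, n: int = 3) -> list[int]:
    """Hash a token's character n-grams into feature ids."""
    padded = f"^{token}$"
    grams = [padded[i:i + n] for i in range(max(1, len(padded) - n + 1))]
    return [hash(g) % N_FEATS for g in grams]

def token_scores(token: str) -> np.ndarray:
    """Per-language scores from a one-hidden-layer feed-forward net."""
    h = np.tanh(W1[token_features(token)].sum(axis=0))
    return h @ W2

def decode(tokens: list[str]) -> list[int]:
    """Global constraint (a simplification of the paper's decoder):
    every token's label must come from the single language pair that
    maximizes the summed per-token scores over the whole sentence."""
    S = np.stack([token_scores(t) for t in tokens])  # shape (T, N_LANGS)
    best_pair, best_total = (0, 1), -np.inf
    for a, b in itertools.combinations(range(N_LANGS), 2):
        total = S[:, [a, b]].max(axis=1).sum()
        if total > best_total:
            best_pair, best_total = (a, b), total
    a, b = best_pair
    return [a if S[t, a] >= S[t, b] else b for t in range(len(tokens))]

# With random weights the labels are meaningless; this only shows the flow.
print(decode(["the", "gato", "sat"]))
```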
2017
Natural Language Processing with Small Feed-Forward Networks
Jan A. Botha | Emily Pitler | Ji Ma | Anton Bakalov | Alex Salcianu | David Weiss | Ryan McDonald | Slav Petrov
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
We show that small and shallow feed-forward neural networks can achieve near state-of-the-art results on a range of unstructured and structured language processing tasks while being considerably cheaper in memory and computational requirements than deep recurrent models. Motivated by resource-constrained environments like mobile phones, we showcase simple techniques for obtaining such small neural network models, and investigate different tradeoffs when deciding how to allocate a small memory budget.
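To make the memory-budget tradeoff concrete, here is a small hypothetical calculation (not taken from the paper) of how a hashed-embedding feed-forward model's parameter count trades vocabulary size against embedding and hidden widths under a fixed budget; the 1 MiB budget and 8-bit quantization are assumptions for the example.

```python
def param_count(vocab: int, emb: int, hidden: int, labels: int) -> int:
    """Parameters of an embed -> hidden -> softmax model, biases included."""
    return vocab * emb + (emb * hidden + hidden) + (hidden * labels + labels)

BUDGET_BYTES = 1 << 20  # assumed 1 MiB model budget
BYTES_PER_PARAM = 1     # assumed 8-bit quantized weights

# Larger hashed vocabularies must be paid for with narrower layers.
for vocab, emb, hidden in [(1 << 14, 16, 64), (1 << 16, 8, 32), (1 << 18, 4, 16)]:
    n = param_count(vocab, emb, hidden, labels=50)
    fits = n * BYTES_PER_PARAM <= BUDGET_BYTES
    print(f"vocab={vocab:>6} emb={emb:>2} hidden={hidden:>2} "
          f"params={n:>8} fits_budget={fits}")
```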
Co-authors
- David Weiss 2
- Jan A. Botha 1
- Emily Pitler 1
- Ji Ma 1
- Alex Salcianu 1
- Yuan Zhang 1
- Jason Riesa 1
- Daniel Gillick 1
- Jason Baldridge 1
- Ryan McDonald 1
- Slav Petrov 1