William Headden
2020
Learn to Cross-lingual Transfer with Meta Graph Learning Across Heterogeneous Languages
Zheng Li | Mukul Kumar | William Headden | Bing Yin | Ying Wei | Yu Zhang | Qiang Yang
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
The recent emergence of multilingual pre-trained language models (mPLMs) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks. However, mPLM-based methods usually suffer from two problems: (1) simple fine-tuning may not adapt general-purpose multilingual representations to be task-aware on low-resource languages; (2) they ignore how cross-lingual adaptation happens for downstream tasks. To address these issues, we propose a meta graph learning (MGL) method. Unlike prior works that transfer from scratch, MGL learns to transfer cross-lingually by extracting meta-knowledge from historical CLT experiences (tasks), making the mPLM insensitive to low-resource languages. Moreover, for each CLT task, MGL formulates the transfer process as information propagation over a dynamic graph, whose geometric structure automatically captures intrinsic language relationships to explicitly guide cross-lingual transfer. Empirically, extensive experiments on both public and real-world datasets demonstrate the effectiveness of the MGL method.
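The abstract frames each transfer task as information propagation over a dynamic graph whose edges capture language relationships. As a rough illustration only, the sketch below is not the paper's implementation; the function names, the dot-product similarity graph, and the two-step propagation schedule are all assumptions made for exposition.

```python
# Illustrative sketch (not the authors' code): cross-lingual transfer viewed as
# information propagation over a dynamic graph of languages. Edge weights are
# rebuilt from the current representations at every step, hence "dynamic".
import numpy as np

def build_dynamic_graph(reps: np.ndarray) -> np.ndarray:
    """Build a soft adjacency matrix from per-language representations.

    reps: (num_languages, dim) array, e.g. pooled mPLM features per language.
    Returns a row-stochastic similarity matrix used as edge weights.
    """
    sims = reps @ reps.T                                   # pairwise similarity
    weights = np.exp(sims - sims.max(axis=1, keepdims=True))
    return weights / weights.sum(axis=1, keepdims=True)    # softmax per row

def propagate(reps: np.ndarray, steps: int = 2) -> np.ndarray:
    """Propagate information across languages for a few steps.

    Each step rebuilds the graph from the current features and mixes every
    language's representation with those of its nearest neighbors, so
    low-resource languages borrow from structurally related ones.
    """
    h = reps
    for _ in range(steps):
        adj = build_dynamic_graph(h)
        h = adj @ h                                        # aggregate neighbors
    return h

# Usage: 5 hypothetical languages with 8-dimensional features.
rng = np.random.default_rng(0)
features = rng.normal(size=(5, 8))
transferred = propagate(features)
print(transferred.shape)  # (5, 8)
```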
2012
A Phrase-Discovering Topic Model Using Hierarchical Pitman-Yor Processes
Robert Lindsey | William Headden | Michael Stipicevic
Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning