Multi-lingual NLP

Meta-learning for fast cross-lingual adaptation in dependency parsing

We apply model-agnostic meta-learning (MAML) to cross-lingual dependency parsing and find that, in a few-shot learning setup, meta-learning combined with pre-training significantly improves performance on a variety of unseen, typologically diverse, low-resource languages.
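To make the MAML idea concrete, here is a minimal, self-contained sketch of first-order MAML on a toy 1-D regression family (tasks y = a·x with different slopes a). This is an illustration of the general algorithm only, not the paper's parser or training setup: the task family, learning rates, and loss are all hypothetical choices for the sketch. Each task takes one inner gradient step from the shared initialization; the outer loop moves that initialization so a single adaptation step works well across tasks, mirroring how a meta-learned parser initialization adapts quickly to a new low-resource language.

```python
# First-order MAML on toy linear-regression tasks y = a * x.
# The "model" is a single scalar parameter theta; the meta-learner
# finds an initialization from which one gradient step adapts well
# to any task in the family. All constants here are illustrative.

XS = [1.0, 2.0]          # shared inputs for every task
ALPHA = 0.05             # inner-loop (task adaptation) learning rate
BETA = 0.05              # outer-loop (meta) learning rate


def loss(theta, a):
    """Mean squared error of prediction theta*x against target a*x."""
    return sum((theta * x - a * x) ** 2 for x in XS) / len(XS)


def grad(theta, a):
    """d/dtheta of the loss above (computed analytically)."""
    return sum(2 * x * (theta * x - a * x) for x in XS) / len(XS)


def adapt(theta, a, steps=1):
    """Inner loop: a few gradient steps on one task's support data."""
    for _ in range(steps):
        theta = theta - ALPHA * grad(theta, a)
    return theta


def meta_train(tasks, theta=0.0, iterations=200):
    """Outer loop (first-order MAML): update the initialization using
    the task gradient evaluated at the *adapted* parameters."""
    for _ in range(iterations):
        meta_grad = 0.0
        for a in tasks:
            theta_task = adapt(theta, a)          # inner adaptation
            meta_grad += grad(theta_task, a)      # query-set gradient
        theta = theta - BETA * meta_grad / len(tasks)
    return theta


if __name__ == "__main__":
    # Meta-train on two "seen" tasks, then adapt to an unseen one.
    theta0 = meta_train(tasks=[1.0, 3.0])
    new_task = 2.5
    before = loss(theta0, new_task)
    after = loss(adapt(theta0, new_task), new_task)
    print(f"init={theta0:.3f}  loss before={before:.4f}  after={after:.4f}")
```

With this symmetric task family the learned initialization settles between the task optima, and a single inner step on the unseen task cuts the loss substantially; the paper applies the same two-loop structure to a neural dependency parser, with languages playing the role of tasks.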