Meta-learning for fast cross-lingual adaptation in dependency parsing

We apply model-agnostic meta-learning (MAML) to the task of cross-lingual dependency parsing and find that meta-learning combined with pre-training can significantly improve few-shot performance on a variety of unseen, typologically diverse, low-resource languages.
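For readers unfamiliar with MAML, the core idea is a two-level optimization: adapt a copy of the model on each task's support set (inner loop), then update the shared initialization from the adapted models' query-set gradients (outer loop). The sketch below is a generic, first-order illustration on a linear toy model, not the paper's parser; all names (`maml_step`, `loss_and_grad`, the learning rates) are hypothetical.

```python
import numpy as np

def loss_and_grad(w, X, y):
    """Squared-error loss and gradient for a linear model y_hat = X @ w."""
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def maml_step(w, tasks, inner_lr=0.01, outer_lr=0.001):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is a tuple (X_support, y_support, X_query, y_query).
    """
    meta_grad = np.zeros_like(w)
    for X_s, y_s, X_q, y_q in tasks:
        # inner loop: one gradient step on the task's support set
        _, g = loss_and_grad(w, X_s, y_s)
        w_adapted = w - inner_lr * g
        # outer loop: query-set gradient at the adapted weights
        # (first-order approximation: no second derivatives)
        _, g_q = loss_and_grad(w_adapted, X_q, y_q)
        meta_grad += g_q
    # move the shared initialization toward weights that adapt well
    return w - outer_lr * meta_grad / len(tasks)
```

In the few-shot cross-lingual setting, each "task" would correspond to a language: a handful of annotated sentences as the support set, held-out sentences as the query set.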

A Multimodal Framework for the Detection of Hateful Memes

We compare various multimodal transformer architectures for the task of detecting hateful memes, and achieve 4th place in the Hateful Memes Challenge organized by Facebook AI at NeurIPS 2020.

Diversifying Task-oriented Dialogue Response Generation with Prototype Guided Paraphrasing

We propose to combine the merits of template-based and corpus-based dialogue response generation by introducing a prototype-based paraphrasing neural network, called P2-Net, which aims to enhance the quality of responses in terms of both precision and diversity.