I am a 3rd-year PhD student working on Natural Language Processing.
I am interested in modular transfer learning, and I currently find adapters a particularly exciting approach. My focus is on multilingual and multimodal transfer, where I am especially interested in very low-resource languages.
I am one of the main contributors to the AdapterHub.ml framework, which makes it easy to add and train new parameters within pre-trained transformer-based language models.
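As a minimal sketch of that workflow, here is roughly how a new adapter is added and trained with the adapter-transformers library behind AdapterHub (the checkpoint and adapter name below are only illustrative):

```python
# Sketch of the adapter workflow with adapter-transformers (AdapterHub.ml);
# "my_task" and the checkpoint name are illustrative placeholders.
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

# Insert a new, randomly initialized adapter into each transformer layer.
model.add_adapter("my_task")

# Freeze the pre-trained weights so only the adapter parameters are trained.
model.train_adapter("my_task")

# Activate the adapter so it is used in the forward pass.
model.set_active_adapters("my_task")
```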
| Date | News |
| --- | --- |
| Jun 7, 2021 | I started as a Research Scientist Intern at Facebook AI Research (FAIR) in London. |
| May 6, 2021 | Our paper *How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models* was accepted to ACL 2021. |
| Jan 11, 2021 | Our paper *AdapterFusion: Non-Destructive Task Composition for Transfer Learning* was accepted to EACL 2021. |
| Jan 1, 2020 | I received the 2020 IBM PhD Fellowship award. |
- *UNKs Everywhere: Adapting Multilingual Language Models to New Scripts*. arXiv preprint, 2021.
- *AdapterFusion: Non-Destructive Task Composition for Transfer Learning*. In Proceedings of EACL 2021.
- *MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer*. In Proceedings of EMNLP 2020.
- *AdapterHub: A Framework for Adapting Transformers*. In Proceedings of EMNLP - System Demonstrations, 2020.