Jonas Pfeiffer

Technical University of Darmstadt

I am a third-year PhD student working on Natural Language Processing.

I am interested in modular transfer learning, and I currently find adapters especially cool to work with. My focus is on multilingual and multimodal transfer, with a particular interest in very low-resource languages.

I am one of the main contributors to the AdapterHub.ml framework, which makes it easy to add and train new adapter parameters within pre-trained transformer-based language models (see the sketch below).
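For context, the typical AdapterHub workflow takes only a few lines. Below is a minimal sketch using the adapter-transformers package that backs AdapterHub; the model name and the "sst-2" adapter/head names are illustrative, not a fixed part of the API:

    # Minimal sketch of the AdapterHub workflow with adapter-transformers.
    # Model and adapter names ("bert-base-uncased", "sst-2") are examples.
    from transformers import AutoModelWithHeads, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

    # Insert new adapter parameters into each transformer layer,
    # plus a matching task head.
    model.add_adapter("sst-2")
    model.add_classification_head("sst-2", num_labels=2)

    # Freeze the pre-trained weights so only the adapter (and head) train.
    model.train_adapter("sst-2")

    # Activate the adapter for forward passes during training/inference.
    model.set_active_adapters("sst-2")

Because the pre-trained weights stay frozen, the same base model can be reused across tasks, with only the small adapter modules swapped in and out.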

news

Jun 7, 2021 I have started as a Research Scientist Intern at Facebook AI Research (FAIR) in London.
May 6, 2021 Our paper How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models was accepted to ACL 2021.
Jan 11, 2021 Our paper AdapterFusion: Non-Destructive Task Composition for Transfer Learning was accepted to EACL 2021.
Jan 1, 2020 I received the 2020 IBM PhD Fellowship.

selected publications

  1. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
    arXiv preprint 2021
  2. AdapterFusion: Non-Destructive Task Composition for Transfer Learning
    In Proceedings of EACL 2021
  3. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
    In Proceedings of EMNLP 2020
  4. AdapterHub: A Framework for Adapting Transformers
    In Proceedings of EMNLP 2020 (System Demonstrations)