Jonas Pfeiffer

Google Research


I am a Research Scientist at Google Research working on Natural Language Processing. My research focuses on modular representation learning in multi-task, multilingual, and multi-modal contexts, as well as in low-resource scenarios.

I am one of the main contributors to the AdapterHub.ml framework, which makes it easy to add and train new parameters (adapters) within pre-trained transformer-based language models.
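To give a feel for the workflow, below is a minimal sketch of adding and training an adapter with the adapter-transformers package behind AdapterHub. The exact class and method names (e.g., AutoAdapterModel) vary across library versions and are assumptions here rather than something stated on this page.

# Minimal sketch: train a new adapter on top of a frozen pre-trained model.
# Assumes the adapter-transformers package (pip install adapter-transformers),
# whose API (AutoAdapterModel, add_adapter, train_adapter) may differ by version.
from transformers import AutoAdapterModel

# Load a pre-trained transformer with support for adapters and prediction heads.
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Add a new, randomly initialized adapter and a matching classification head.
model.add_adapter("sst2")
model.add_classification_head("sst2", num_labels=2)

# Activate the adapter and freeze all pre-trained weights, so that only
# the adapter (and head) parameters receive gradient updates.
model.train_adapter("sst2")

# The model can now be fine-tuned with a standard training loop or Trainer;
# the frozen backbone is shared across tasks, and the trained adapter can be
# saved with model.save_adapter("./sst2_adapter", "sst2") and shared on AdapterHub.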

news

Aug 22, 2022 I have started as a Research Scientist at Google Research in Zürich.
May 15, 2022 Our paper IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages was accepted at ICML 2022.
Apr 7, 2022 I am happy to announce that my paper Lifting the Curse of Multilinguality by Pre-training Modular Transformers was accepted at NAACL 2022.
Feb 21, 2022 I am happy to announce that four of my papers were accepted at ACL/TACL 2022.
Jan 24, 2022 I am excited to announce that I have joined New York University, where I will be working with Kyunghyun Cho during my 6-month research visit.

selected publications

  1. Lifting the Curse of Multilinguality by Pre-training Modular Transformers
    Pfeiffer, Jonas, Goyal, Naman, Lin, Xi Victoria, Li, Xian, Cross, James, Riedel, Sebastian, and Artetxe, Mikel
    In Proceedings of NAACL 2022
  2. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
    Pfeiffer, Jonas, Vulić, Ivan, Gurevych, Iryna, and Ruder, Sebastian
    In Proceedings of EMNLP 2021
  3. AdapterFusion: Non-Destructive Task Composition for Transfer Learning
    Pfeiffer, Jonas, Kamath, Aishwarya, Rücklé, Andreas, Cho, Kyunghyun, and Gurevych, Iryna
    In Proceedings of EACL 2021
  4. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
    Pfeiffer, Jonas, Vulić, Ivan, Gurevych, Iryna, and Ruder, Sebastian
    In Proceedings of EMNLP 2020
  5. AdapterHub: A Framework for Adapting Transformers
    Pfeiffer, Jonas, Rücklé, Andreas, Poth, Clifton, Kamath, Aishwarya, Vulić, Ivan, Ruder, Sebastian, Cho, Kyunghyun, and Gurevych, Iryna
    In Proceedings of EMNLP 2020 - System Demonstrations