Jonas Pfeiffer
I am a Research Scientist at Google Research working on Natural Language Processing. My interests include modular representation learning in multi-task, multilingual, and multi-modal contexts, as well as low-resource scenarios.
I am one of the main contributors to the AdapterHub.ml framework, which makes it easy to add and train new parameters within pre-trained transformer-based language models.
news
Aug 22, 2022 | I have started as a Research Scientist at Google Research in Zürich.
May 15, 2022 | Our paper IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages was accepted at ICML 2022.
Apr 7, 2022 | I am happy to announce that my paper Lifting the Curse of Multilinguality by Pre-training Modular Transformers was accepted at NAACL 2022.
Feb 21, 2022 | I am happy to announce that 4 of my papers were accepted at ACL/TACL 2022.
Jan 24, 2022 | I am excited to announce that I have joined New York University, where I will be working with Kyunghyun Cho during my six-month research visit.