Media type:
E-Article
Title:
Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages
Contributor:
Ponti, Edoardo M.;
Vulić, Ivan;
Cotterell, Ryan;
Parovic, Marinela;
Reichart, Roi;
Korhonen, Anna
Published:
MIT Press, 2021
Published in:
Transactions of the Association for Computational Linguistics, 9 (2021), pp. 410-428
Language:
English
DOI:
10.1162/tacl_a_00374
ISSN:
2307-387X
Abstract:
Most combinations of NLP tasks and language varieties lack in-domain examples for supervised training because of the paucity of annotated data. How can neural models make sample-efficient generalizations from task–language combinations with available data to low-resource ones? In this work, we propose a Bayesian generative model for the space of neural parameters. We assume that this space can be factorized into latent variables for each language and each task. We infer the posteriors over such latent variables based on data from seen task–language combinations through variational inference. This enables zero-shot classification on unseen combinations at prediction time. For instance, given training data for named entity recognition (NER) in Vietnamese and for part-of-speech (POS) tagging in Wolof, our model can perform accurate predictions for NER in Wolof. In particular, we experiment with a typologically diverse sample of 33 languages from 4 continents and 11 families, and show that our model yields comparable or better results than state-of-the-art, zero-shot cross-lingual transfer methods. Our code is available at github.com/cambridgeltl/parameter-factorization.
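As a toy illustration of the factorization idea described in the abstract (not the paper's actual implementation, which infers full variational posteriors over the latents), the following NumPy sketch composes point-estimate language and task latents into classifier parameters; all names, dimensions, and the concatenation-plus-linear generator are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: latent size per factor, classifier parameter size.
D_LATENT, D_PARAMS = 8, 16

# One latent vector per language and per task. In the paper these are
# random variables with posteriors inferred via variational inference;
# here they are fixed point estimates for illustration only.
languages = {"vi": rng.normal(size=D_LATENT), "wo": rng.normal(size=D_LATENT)}
tasks = {"ner": rng.normal(size=D_LATENT), "pos": rng.normal(size=D_LATENT)}

# Shared generator mapping a (language, task) latent pair to parameters.
W = rng.normal(size=(D_PARAMS, 2 * D_LATENT))

def generate_params(lang: str, task: str) -> np.ndarray:
    """Compose factorized latents into task-language-specific parameters."""
    z = np.concatenate([languages[lang], tasks[task]])
    return W @ z

# Seen combinations during training: NER in Vietnamese, POS in Wolof.
theta_ner_vi = generate_params("vi", "ner")
theta_pos_wo = generate_params("wo", "pos")

# Zero-shot: parameters for NER in Wolof, a combination never seen together.
theta_ner_wo = generate_params("wo", "ner")
```

Because each language and each task contributes an independent latent factor, any combination of a seen language and a seen task yields parameters, even when that particular pairing had no training data.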