MetaBayes: a meta-learning framework from a Bayesian perspective

Authors: Tamara AlShammari, Anis Elgabli, Mehdi Bennis
Language: English
Year of publication: 2022
Description: Meta-learning is a powerful learning paradigm in which solving a new task can benefit from similar tasks for faster adaptation (few-shot learning). Stochastic gradient descent (SGD) based meta-learning has emerged as an attractive solution for few-shot learning. However, this approach suffers from significant computational complexity due to its double-loop structure and matrix-inversion operations, which also incur considerable uncertainty and poor generalization. To achieve lower complexity and better generalization, in this paper we propose MetaBayes, a novel framework that views the original meta-learning problem from a Bayesian perspective, where the meta-model is cast as the prior distribution and the task-specific models are viewed as task-specific posterior distributions. The objective amounts to jointly optimizing the prior and the posterior distributions. With this formulation, we obtain closed-form expressions to update the distributions at every iteration, avoiding the high computational cost of SGD-based meta-learning and producing a more robust and generalizable meta-model. Our simulations show that tasks with few training samples achieve higher accuracy when the MetaBayes prior distribution is used as an initializer compared to the commonly used Gaussian prior distribution.
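As a rough illustration of the joint prior-posterior optimization described above, one common way to write such an objective under standard variational Bayesian assumptions is sketched below; the symbols (model parameters \theta, per-task posteriors q_t, task losses \mathcal{L}_t, datasets \mathcal{D}_t, weight \lambda, and task count T) are illustrative and not taken from the record, and the paper's exact closed-form updates may differ:

\min_{p,\, q_1,\dots,q_T} \; \sum_{t=1}^{T} \Big( \mathbb{E}_{\theta \sim q_t}\big[\mathcal{L}_t(\theta; \mathcal{D}_t)\big] \;+\; \lambda \, \mathrm{KL}\big(q_t(\theta)\,\|\,p(\theta)\big) \Big)

Here the shared prior p(\theta) plays the role of the meta-model, each q_t(\theta) is a task-specific posterior fit to its few training samples, and the KL term couples the tasks so that the learned prior can serve as an initializer for new tasks.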
Database: OpenAIRE