EE_MAML

Code supporting the publication: When MAML Learns Quickly, Does It Generalize Well?

3 contributors

Description

The code implements Model-Agnostic Meta-Learning (MAML), a framework that trains a model to learn how to learn across tasks, in simple linear and nonlinear regression settings. It repeatedly samples tasks, performs a small number of inner-loop gradient updates to adapt the model to each task, and then updates the shared initialization based on the post-adaptation performance across tasks. The experiments sample tasks from synthetic datasets and target fast adaptation with minimal gradient steps. This code specifically explores restricting the number of inner-loop updates and compares the resulting generalization performance against standard baseline learners on synthetic regression problems.
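The inner/outer-loop structure described above can be sketched as follows. This is a minimal first-order MAML example for 1D linear regression tasks (y = a·x with a task-specific slope a), not code from the repository; the function names, hyperparameters, and use of the first-order gradient approximation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_slope():
    """Sample a task: a slope a for the regression y = a * x (illustrative task family)."""
    return rng.uniform(-2.0, 2.0)

def make_data(a, n=10):
    """Draw inputs and noiseless targets for a task with slope a."""
    x = rng.uniform(-1.0, 1.0, size=n)
    return x, a * x

def mse_grad(w, x, y):
    """Gradient of the mean squared error for the linear model f(x) = w * x."""
    return np.mean(2.0 * (w * x - y) * x)

def maml_train(n_outer=500, n_inner=1, meta_batch=4, alpha=0.1, beta=0.01):
    """First-order MAML: adapt per task with a few inner steps,
    then update the shared initialization w from post-adaptation gradients."""
    w = 1.0  # shared initialization (scalar model for simplicity)
    for _ in range(n_outer):
        outer_grad = 0.0
        for _ in range(meta_batch):
            a = sample_slope()
            x_support, y_support = make_data(a)  # data for inner-loop adaptation
            x_query, y_query = make_data(a)      # held-out data for the outer update
            w_task = w
            for _ in range(n_inner):             # restricted number of inner updates
                w_task -= alpha * mse_grad(w_task, x_support, y_support)
            # First-order approximation: use the post-adaptation gradient directly,
            # ignoring second-order terms through the inner-loop updates.
            outer_grad += mse_grad(w_task, x_query, y_query)
        w -= beta * outer_grad / meta_batch
    return w

if __name__ == "__main__":
    print("meta-learned initialization:", maml_train())
```

With slopes drawn symmetrically around zero, the meta-learned initialization drifts toward zero, the point from which one gradient step adapts equally well to any sampled task; restricting `n_inner` is what the repository's experiments vary.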

Programming languages
  • Python 52%
  • Other 48%
License
  • GPL-3.0-only

Member of community: 4TU