The supervised-reptile repository contains the code accompanying the paper “On First-Order Meta-Learning Algorithms”, which introduces Reptile, a meta-learning algorithm that learns model parameter initializations capable of adapting quickly to new tasks. The implementation targets supervised few-shot learning settings (e.g. Omniglot, Mini-ImageNet), not reinforcement learning, and includes scripts to run training and evaluation for few-shot classification.

The core idea is simple: sample a task, train on that task (inner loop), then move the initialization parameters toward the adapted parameters (outer loop). Because Reptile is a first-order algorithm, it avoids computing second derivatives or full meta-gradients, making it computationally simpler while retaining good performance.

The repo includes training scripts, dataset fetchers (Omniglot, Mini-ImageNet), and modules defining the Reptile update logic, model variables, and hyperparameters.
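The inner/outer loop described above can be sketched in a few lines. This is a minimal NumPy illustration on a toy 1-D regression task family, not the repo's actual TensorFlow implementation; the task family, function names, and hyperparameter values here are all assumptions chosen for clarity:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # Hypothetical task family: 1-D linear regression y = a * x,
    # with the slope a drawn fresh for each task.
    return rng.uniform(-2.0, 2.0)

def task_grad(w, a, batch=10):
    # Gradient of mean squared error for the model y = w * x
    # on a minibatch drawn from the task y = a * x.
    x = rng.uniform(-1.0, 1.0, size=batch)
    return np.mean(2.0 * (w * x - a * x) * x)

def reptile_step(w, inner_iters=5, inner_lr=0.1, meta_step=0.5):
    a = sample_task()
    w_task = w
    for _ in range(inner_iters):          # inner loop: plain SGD on the sampled task
        w_task -= inner_lr * task_grad(w_task, a)
    return w + meta_step * (w_task - w)   # outer loop: move toward adapted weights

# Meta-training: repeat (sample task, adapt, nudge initialization).
w = 0.0
for _ in range(1000):
    w = reptile_step(w)
```

Note that the outer update uses only the difference `w_task - w`, never a gradient through the inner loop, which is exactly what makes Reptile first-order.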
Features
- Implementation of the Reptile algorithm for few-shot supervised meta-learning
- Support scripts for Omniglot and Mini-ImageNet experiment setups
- First-order meta-learning (no second derivatives) for computational simplicity
- Command-line interface for hyperparameter control (shots, inner/outer loops, meta steps)
- Dataset download / preprocessing utilities (e.g. fetch_data.sh)
- Modular structure: reptile.py, variables.py, dataset modules, experiment drivers
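Few-shot experiments like those the feature list mentions are conventionally organized as N-way K-shot episodes: each task is a small classification problem over N sampled classes with K labeled examples per class. A self-contained sketch of episode sampling (the function and variable names are illustrative, not the repo's actual dataset API):

```python
import random

def sample_episode(class_to_examples, n_way=5, k_shot=1, n_query=1, rng=None):
    """Sample an N-way K-shot episode: a support set for inner-loop
    training and a query set for evaluation (standard few-shot setup)."""
    rng = rng or random.Random(0)
    classes = rng.sample(sorted(class_to_examples), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = rng.sample(class_to_examples[cls], k_shot + n_query)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy "dataset": 20 classes with 5 examples each (stand-ins for images).
data = {f"class_{i}": [f"img_{i}_{j}" for j in range(5)] for i in range(20)}
support, query = sample_episode(data)   # 5-way 1-shot episode
```

In a Reptile run, the inner loop trains on `support` and evaluation accuracy is measured on `query`.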