fastai's training loop is highly extensible, with a rich callback system. See the
callback docs if you're interested in writing your own callback. See below for a list of callbacks that are provided with fastai, grouped by the module they're defined in.
Every callback that is passed to
Learner with the
callback_fns parameter will be automatically stored as an attribute. The attribute name is snake-cased, so for instance
ActivationStats will appear as
learn.activation_stats (assuming your object is named learn).
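Under the hood this is a plain camel-to-snake conversion of the class name. A minimal sketch of how such a name could be derived (the helper name camel2snake echoes fastai's internal utility, but treat this as an illustration rather than the library's exact code):

```python
import re

def camel2snake(name):
    # Insert an underscore before each new capitalized word, then lowercase:
    # "ActivationStats" -> "activation_stats".
    s1 = re.sub(r'(.)([A-Z][a-z]+)', r'\1_\2', name)
    return re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', s1).lower()

print(camel2snake('ActivationStats'))  # activation_stats
```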
This sub-package contains more sophisticated callbacks, each in its own module. They are (click the link for more details):
Train with Leslie Smith's 1cycle annealing method.
Use fp16 to take advantage of tensor cores on recent NVIDIA GPUs for a 200% or more speedup.
Create your own multi-stage annealing schemes with a convenient API.
Data augmentation using the method from "mixup: Beyond Empirical Risk Minimization"
Use Leslie Smith's learning rate finder to find a good learning rate for training your model.
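To make the first item above concrete, here is a minimal sketch of a 1cycle-style schedule: the learning rate warms up from a low value to a peak, then anneals back down. The parameter names pct_start and div echo the arguments of fastai's fit_one_cycle, but the exact curve fastai uses has varied between versions, so this is an illustration of the idea rather than the library's implementation:

```python
import math

def one_cycle_lr(step, total_steps, lr_max, pct_start=0.3, div=25.0):
    # Warm up linearly from lr_max/div to lr_max over the first pct_start
    # of training, then cosine-anneal back down for the remainder.
    warm_steps = int(total_steps * pct_start)
    lr_start = lr_max / div
    if step < warm_steps:
        pct = step / max(1, warm_steps)
        return lr_start + pct * (lr_max - lr_start)
    pct = (step - warm_steps) / max(1, total_steps - warm_steps)
    return lr_start + (lr_max - lr_start) * (1 + math.cos(math.pi * pct)) / 2
```

Calling one_cycle_lr(step, 100, 1.0) for step = 0..100 traces the characteristic shape: the rate rises from 0.04 to 1.0 over the first 30 steps, then decays smoothly back to 0.04.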