Callbacks implemented in the fastai library

List of callbacks

fastai's training loop is highly extensible, with a rich callback system. See the callback docs if you're interested in writing your own callback. See below for a list of callbacks that are provided with fastai, grouped by the module they're defined in.

Every callback that is passed to Learner with the callback_fns parameter will be automatically stored as an attribute. The attribute name is snake-cased, so for instance ActivationStats will appear as learn.activation_stats (assuming your object is named learn).
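The snake-casing can be sketched with a small regex helper (an illustrative stand-in for the conversion fastai applies internally; the function name here is an assumption):

```python
import re

def camel2snake(name):
    # Insert an underscore before each capitalized word, then lowercase.
    # (Illustrative helper; fastai performs an equivalent conversion internally.)
    s1 = re.sub(r'(.)([A-Z][a-z]+)', r'\1_\2', name)
    return re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', s1).lower()

print(camel2snake('ActivationStats'))   # activation_stats
```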


This sub-package contains more sophisticated callbacks, each in its own module. They are (click the link for more details):


Train with Leslie Smith's 1cycle annealing method.
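The shape of the schedule can be sketched in pure Python (a rough illustration of the one-cycle policy, not fastai's exact implementation; parameter names such as `pct_start` and `div` are assumptions):

```python
import math

def one_cycle_lr(pct, lr_max=1e-2, div=25.0, pct_start=0.3, final_div=1e4):
    """Learning rate at fraction `pct` (0..1) of training: cosine warmup
    from lr_max/div up to lr_max, then cosine anneal down to lr_max/final_div."""
    if pct < pct_start:                          # warmup phase
        p = pct / pct_start
        lo, hi = lr_max / div, lr_max
    else:                                        # annealing phase
        p = (pct - pct_start) / (1 - pct_start)
        lo, hi = lr_max, lr_max / final_div
    # cosine interpolation from lo (at p=0) to hi (at p=1)
    return hi + (lo - hi) * (1 + math.cos(math.pi * p)) / 2
```

Momentum follows the mirror-image path (high at the ends, low at the peak learning rate).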


Use fp16 to take advantage of tensor cores on recent NVIDIA GPUs for a 200% or more speedup.
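Mixed precision needs loss scaling because small gradient values underflow to zero in fp16. The underflow can be demonstrated with the standard library's half-precision packing (a pure-Python illustration; fastai's callback operates on PyTorch tensors, and the scale factor below is an assumed typical value):

```python
import struct

def as_fp16(x):
    """Round-trip a float through IEEE 754 half precision (struct format 'e')."""
    return struct.unpack('e', struct.pack('e', x))[0]

grad = 1e-8                   # a tiny gradient value
scale = 512.0                 # assumed loss-scale factor for illustration

print(as_fp16(grad))          # 0.0 -- underflows in fp16
print(as_fp16(grad * scale))  # nonzero; gradients are divided by `scale` afterwards
```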


Create your own multi-stage annealing schemes with a convenient API.
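The idea can be sketched as a list of phases, each with a length, start/end values, and an annealing function (names like `Phase` and `annealing_cos` are assumptions, loosely modeled on the multi-phase scheduling idea, not fastai's exact API):

```python
import math
from dataclasses import dataclass
from typing import Callable

def annealing_lin(start, end, p):
    return start + p * (end - start)

def annealing_cos(start, end, p):
    return end + (start - end) * (1 + math.cos(math.pi * p)) / 2

@dataclass
class Phase:
    length: int                      # number of iterations in this phase
    start: float                     # hyper-parameter value at phase start
    end: float                       # value at phase end
    anneal: Callable = annealing_lin

def schedule(phases, it):
    """Value of the scheduled hyper-parameter at iteration `it`."""
    for ph in phases:
        if it < ph.length:
            return ph.anneal(ph.start, ph.end, it / ph.length)
        it -= ph.length
    return phases[-1].end

# e.g. linear warmup for 100 iterations, then cosine decay for 900
phases = [Phase(100, 1e-4, 1e-2), Phase(900, 1e-2, 1e-6, annealing_cos)]
```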


Data augmentation using the method from the paper "mixup: Beyond Empirical Risk Minimization".
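The core of the method is simple: draw λ from a Beta(α, α) distribution and blend two examples, weighting their losses the same way. A pure-Python sketch (fastai's callback does this on tensor batches; taking max(λ, 1−λ) mirrors a trick fastai uses so the blend always leans toward the first example):

```python
import random

def mixup_pair(x1, x2, alpha=0.4):
    """Blend two inputs; the returned lam should weight the two targets' losses."""
    lam = random.betavariate(alpha, alpha)
    lam = max(lam, 1 - lam)          # keep the larger coefficient
    mixed = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    return mixed, lam

# the training loss becomes: lam * loss(pred, y1) + (1 - lam) * loss(pred, y2)
```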


Use Leslie Smith's learning rate finder to find a good learning rate for training your model.
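The finder's mechanics can be sketched in pure Python: sweep the learning rate exponentially across a range, record the loss at each step, and stop once it blows up. Here the "model" is a toy quadratic; all names are illustrative, not fastai's API:

```python
def lr_find_sketch(start_lr=1e-5, end_lr=10.0, num_it=100):
    """Exponentially increase lr each step; return (lrs, losses) until divergence."""
    w = 5.0                                        # toy parameter, loss = w**2
    lrs, losses, best = [], [], float('inf')
    for i in range(num_it):
        lr = start_lr * (end_lr / start_lr) ** (i / (num_it - 1))
        loss = w * w
        if loss > 4 * best:                        # diverged: stop the sweep
            break
        best = min(best, loss)
        lrs.append(lr)
        losses.append(loss)
        w -= lr * 2 * w                            # SGD step on d(w^2)/dw = 2w
    return lrs, losses

# pick a lr roughly an order of magnitude below where the loss starts climbing
```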


Convenient wrapper for registering and automatically deregistering PyTorch hooks. Also contains pre-defined hook callback: ActivationStats.

train and basic_train


Track per-batch and per-epoch smoothed losses and metrics.
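The smoothing is an exponentially weighted moving average with bias correction, in the spirit of fastai's `SmoothenValue` helper (a pure-Python sketch; the `beta` default is an assumption):

```python
class SmoothValue:
    """Exponential moving average with bias correction for early steps."""
    def __init__(self, beta=0.98):
        self.beta, self.n, self.mov_avg = beta, 0, 0.0

    def add(self, val):
        self.n += 1
        self.mov_avg = self.beta * self.mov_avg + (1 - self.beta) * val
        # debias: without this, early averages are pulled toward zero
        self.smooth = self.mov_avg / (1 - self.beta ** self.n)
        return self.smooth
```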


Dynamically display a learning chart during training.


Freeze the moving-average statistics of batchnorm layers in the non-trainable (frozen) parts of the model.


Clip gradients during training.
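The underlying operation is clipping by global norm: when the L2 norm of the gradients exceeds a threshold, rescale them so the norm equals the threshold (a pure-Python sketch; fastai's callback delegates to PyTorch's gradient-norm clipping on real tensors):

```python
import math

def clip_grad_norm(grads, max_norm):
    """Scale the gradient vector down so its L2 norm is at most max_norm."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm:
        scale = max_norm / norm          # shrink all components uniformly
        grads = [g * scale for g in grads]
    return grads
```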