Callbacks implemented in the fastai library

List of callbacks

fastai's training loop is highly extensible, with a rich callback system. See the callback docs if you're interested in writing your own callback. See below for a list of callbacks that are provided with fastai, grouped by the module they're defined in.

Every callback that is passed to Learner with the callback_fns parameter will be automatically stored as an attribute. The attribute name is snake-cased, so for instance ActivationStats will appear as learn.activation_stats (assuming your object is named learn).
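For instance, a minimal sketch of this mechanism (the MNIST_SAMPLE data, resnet18 architecture, and metric here are illustrative choices, not requirements):

```python
from fastai.vision import *
from fastai.callbacks import ActivationStats

path = untar_data(URLs.MNIST_SAMPLE)
data = ImageDataBunch.from_folder(path)

# Pass the callback class via callback_fns; fastai instantiates it with the
# Learner and stores it under the snake-cased attribute name.
learn = cnn_learner(data, models.resnet18, metrics=accuracy,
                    callback_fns=ActivationStats)
learn.fit(1)
learn.activation_stats  # the ActivationStats instance, named automatically
```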

Callback

This sub-package contains the more sophisticated callbacks, each in its own module. They are (click the links for more details):

OneCycleScheduler

Train with Leslie Smith's 1cycle annealing method.
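The usual entry point is `Learner.fit_one_cycle`, which attaches this scheduler for you. A minimal sketch (the data, model, and learning rate are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
learn = cnn_learner(data, models.resnet18, metrics=accuracy)

# One cycle: the lr ramps up to max_lr then anneals back down,
# while momentum moves in the opposite direction.
learn.fit_one_cycle(2, max_lr=1e-3)
```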

MixedPrecision

Train in mixed precision (fp16) to take advantage of the tensor cores on recent NVIDIA GPUs, for a speedup of 200% or more.
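Rather than constructing the callback by hand, you can call `Learner.to_fp16`, which wires it up for you. A sketch (data and model are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
# to_fp16 converts the model to half precision and attaches the
# MixedPrecision callback (loss scaling, fp32 master weights).
learn = cnn_learner(data, models.resnet18, metrics=accuracy).to_fp16()
learn.fit_one_cycle(1)
```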

GeneralScheduler

Create your own multi-stage annealing schemes with a convenient API.
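A sketch of a two-phase cosine schedule using the fastai v1 `TrainingPhase` API; the phase lengths and learning-rate ranges are arbitrary illustrative values:

```python
from fastai.vision import *
from fastai.callbacks.general_sched import GeneralScheduler, TrainingPhase

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
learn = cnn_learner(data, models.resnet18, metrics=accuracy)

n = len(learn.data.train_dl)  # number of batches in one epoch
# Phase 1: cosine-anneal the lr up from 1e-5 to 1e-3; phase 2: back down to 1e-6.
phases = [TrainingPhase(n).schedule_hp('lr', (1e-5, 1e-3), anneal=annealing_cos),
          TrainingPhase(n).schedule_hp('lr', (1e-3, 1e-6), anneal=annealing_cos)]
learn.callbacks.append(GeneralScheduler(learn, phases))
learn.fit(2)
```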

MixUpCallback

Data augmentation using the method from the paper mixup: Beyond Empirical Risk Minimization.
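The convenience method `Learner.mixup` attaches this callback for you. A sketch, where alpha=0.4 (the default) controls the strength of the mixing:

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
# mixup() attaches MixUpCallback: each training batch becomes a convex
# combination of two batches, with weights drawn from Beta(alpha, alpha).
learn = cnn_learner(data, models.resnet18, metrics=accuracy).mixup(alpha=0.4)
learn.fit_one_cycle(1)
```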

LRFinder

Use Leslie Smith's learning rate finder to find a good learning rate for training your model.
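The callback is driven by `Learner.lr_find`, and the result is read off the Recorder's plot. A sketch (data and model are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
learn = cnn_learner(data, models.resnet18, metrics=accuracy)

learn.lr_find()        # mock training run with an exponentially growing lr
learn.recorder.plot()  # loss vs. lr; pick a value on the downward slope
```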

HookCallback

Convenient wrapper for registering and automatically deregistering PyTorch hooks. Also contains the pre-defined hook callback ActivationStats.
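To write your own, subclass `HookCallback` and implement `hook`; whatever it returns is stored for you on each forward pass. A minimal sketch, where `MaxActivation` is a hypothetical name, not part of the library:

```python
from fastai.vision import *
from fastai.callbacks.hooks import HookCallback

class MaxActivation(HookCallback):
    "Hypothetical example: record the max activation of each hooked module."
    def hook(self, m, i, o):
        # m: the module, i: its input, o: its output for this forward pass
        return o.max().item()

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
learn = cnn_learner(data, models.resnet18, callback_fns=MaxActivation)
learn.fit(1)
learn.max_activation.hooks.stored  # one value per hooked module (last batch)
```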

train and basic_train

Recorder

Track per-batch and per-epoch smoothed losses and metrics.
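Recorder is attached to every Learner automatically, as learn.recorder. A sketch of typical use after training (data and model are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
learn = cnn_learner(data, models.resnet18, metrics=accuracy)
learn.fit_one_cycle(2)

learn.recorder.plot_losses()  # smoothed training loss and validation loss
learn.recorder.plot_lr()      # the learning-rate schedule that was followed
```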

ShowGraph

Dynamically display a learning chart during training.
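A sketch of attaching it via callback_fns, as described above (data and model are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
# ShowGraph redraws the loss plot at the end of each epoch.
learn = cnn_learner(data, models.resnet18, metrics=accuracy,
                    callback_fns=ShowGraph)
learn.fit_one_cycle(2)
```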

BnFreeze

Freeze the moving-average statistics of all non-trainable batchnorm layers.
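This matters most when fine-tuning a frozen pretrained body, as in this sketch (data and model are illustrative):

```python
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
# With BnFreeze, batchnorm layers in the frozen part of the model are kept
# in eval mode, so their running mean/variance stop updating.
learn = cnn_learner(data, models.resnet18, metrics=accuracy,
                    callback_fns=[BnFreeze])
learn.freeze()
learn.fit_one_cycle(1)
```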

GradientClipping

Clip gradients during training.
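A sketch of attaching it with a clipping threshold via functools.partial; clip=0.1 is an arbitrary illustrative value:

```python
from functools import partial
from fastai.vision import *

data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE))
# Gradients are clipped to the given norm after the backward pass,
# before each optimizer step.
learn = cnn_learner(data, models.resnet18, metrics=accuracy,
                    callback_fns=[partial(GradientClipping, clip=0.1)])
learn.fit_one_cycle(1)
```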