These methods are automatically added to all `Learner` objects created after importing this module. They provide convenient access to a number of callbacks, without requiring them to be manually created.
Fit a model following the 1cycle policy. The maximum learning rate `max_lr` defaults to `slice(None, 0.003, None)`.
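For instance, assuming a `Learner` called `learn` already exists (the epoch count and learning-rate slice below are illustrative values, not defaults):

```python
# Train for 3 epochs with the 1cycle schedule; passing `max_lr` as a slice
# spreads discriminative learning rates across the layer groups.
learn.fit_one_cycle(3, max_lr=slice(1e-5, 1e-3))
```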
A last extension method comes from the module `tta`.
Applies Test Time Augmentation to `learn` on the dataset `ds_type`. We take the average of our regular predictions (with a weight `beta`) with the average of predictions obtained through augmented versions of the training set (with a weight `1-beta`). The transforms decided for the training set are applied with a few changes: `scale` controls the scale for zoom (which isn't random), the cropping isn't random but we make sure to get the four corners of the image, and flipping isn't random but applied once on each of those corner images (so that makes 8 augmented versions total).
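As a sketch of how this might be called, assuming fastai v1's `DatasetType` enum and the `learn` object from the examples below (the `beta` and `scale` values are illustrative):

```python
from fastai.basic_data import DatasetType

# Average regular predictions with the 8 augmented versions on the validation set.
preds, targets = learn.TTA(beta=0.4, scale=1.35, ds_type=DatasetType.Valid)
```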
We'll show examples below using our MNIST sample.
```python
path = untar_data(URLs.MNIST_SAMPLE)
data = ImageDataBunch.from_folder(path)
```
```python
learn = create_cnn(data, models.resnet18, metrics=accuracy, callback_fns=ShowGraph)
learn.fit(3)
```
If we have `last_metrics`, plot them in `self.pbar`.
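To adapt this behaviour, one could write a small callback of the same shape. A minimal sketch, assuming fastai v1's `LearnerCallback` base class (the `PrintMetrics` name is hypothetical, not part of the library):

```python
from fastai.basic_train import LearnerCallback

class PrintMetrics(LearnerCallback):
    "Report `last_metrics` after each epoch (illustrative only)."
    def on_epoch_end(self, epoch, last_metrics, **kwargs):
        # `last_metrics` holds the validation loss followed by any metrics.
        if last_metrics is not None:
            print(f"epoch {epoch}: {last_metrics}")
```

It can then be passed via `callback_fns`, just like `ShowGraph` above.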
Clips gradients at a maximum absolute value of `clip` during training. For instance:
```python
learn = create_cnn(data, models.resnet18, metrics=accuracy, callback_fns=partial(GradientClipping, clip=0.1))
learn.fit(1)
```
```
Total time: 00:11
epoch  train loss  valid loss  accuracy
0      0.086958    0.038721    0.989696    (00:11)
```
Clip the gradients after they are computed but before the optimizer step.
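As a sketch of the general idea in plain PyTorch (this illustrates the concept only, not fastai's actual implementation; the tiny model and values are placeholders):

```python
import torch
import torch.nn as nn

model = torch.nn.Linear(10, 2)  # stand-in model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(4, 10)).sum()
loss.backward()
# Cap every gradient element at +/- 0.1 before the optimizer applies the update.
nn.utils.clip_grad_value_(model.parameters(), clip_value=0.1)
optimizer.step()
```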
For batchnorm layers where `requires_grad==False`, you generally don't want to update their moving average statistics, in order to avoid the model's statistics getting out of sync with its pre-trained weights. You can add this callback to automate this freezing of statistics (internally, it calls `eval` on these layers).
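The underlying idea can be sketched in plain PyTorch (the helper name `freeze_bn_stats` is hypothetical, not a fastai function):

```python
import torch.nn as nn

def freeze_bn_stats(model: nn.Module):
    "Put frozen batchnorm layers in eval mode so their running stats stay fixed."
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            # Only touch layers whose weights are frozen (requires_grad == False).
            if all(not p.requires_grad for p in module.parameters()):
                module.eval()
```

Such a helper would need to run again after every call to `model.train()`, which is what the callback automates.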
```python
learn = create_cnn(data, models.resnet18, metrics=accuracy, callback_fns=BnFreeze)
learn.fit(1)
```
```
Total time: 00:07
epoch  train loss  valid loss  accuracy
0      0.079278    0.041832    0.985280    (00:07)
```