This module contains the functions needed to download several useful datasets that we might want to use in our models.
This class contains all the datasets' and models' URLs, along with some classmethods to help use them; you don't create objects of this class. The supported datasets are (with their calling name): MNIST. To get details on the datasets, see the fast.ai datasets webpage. Datasets with `SAMPLE` in their name are subsets of the original datasets. In the case of MNIST, we also have a `TINY` dataset, which is even smaller than `MNIST_SAMPLE`.
Models are currently limited to `WT103`, but you can expect more in the future!
For the rest of the datasets, you will need to download them with `untar_data` or `download_data`. `untar_data` will download the data file and decompress it, while `download_data` will just download and save the compressed file in `.tgz` format. By default, data will be downloaded to `~/.fastai/data`.
Configure the default `data_path` by editing `~/.fastai/config.yml`.
Note: if the data file already exists in a `data` directory inside the notebook's folder, that data file will be used instead of the one in `~/.fastai/data`. Paths are resolved by calling the function `datapath4file`, which checks whether the data exists locally (in `data/`) first, before downloading to the `~/.fastai/data` home directory.
All the downloading functions use this to decide where to put the tgz and the expanded folder. If `filename` already exists in a `data` directory in the same place as the calling notebook/script, that is used as the parent directory; otherwise `~/.fastai/config.yml` is read to see what path to use, which defaults to `~/.fastai/data`. To override this default, simply modify the value in your `~/.fastai/config.yml`.
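For example, a `~/.fastai/config.yml` that points the data directory at a bigger disk might look like this (the path shown is purely illustrative):

```yaml
data_path: /mnt/bigdisk/fastai-data
```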
You probably won't need to use this yourself; it's used by the downloading functions above.