from fastai.tabular.all import *

path = untar_data(URLs.ADULT_SAMPLE)
df = pd.read_csv(path/'adult.csv')
cat_names = ['workclass', 'education', 'marital-status', 'occupation', 'relationship', 'race']
cont_names = ['age', 'fnlwgt', 'education-num']
procs = [Categorify, FillMissing, Normalize]
dls = TabularDataLoaders.from_df(df, path, procs=procs, cat_names=cat_names, cont_names=cont_names,
                                 y_names="salary", valid_idx=list(range(800,1000)), bs=64)
learn = tabular_learner(dls)
Tabular learner
A Learner ready to train for tabular data
The main function you probably want to use in this module is tabular_learner. It will automatically create a TabularModel suitable for your data and infer the right loss function. See the tabular tutorial for an example of use in context.
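As a minimal sketch of that workflow, reusing the `dls` built above (the metric and the single training cycle are illustrative choices, not library defaults):

```python
# Minimal sketch, assuming the `dls` built above from ADULT_SAMPLE.
# tabular_learner creates a TabularModel and infers the loss function from the
# DataLoaders; we only choose a metric and train for one cycle.
learn = tabular_learner(dls, metrics=accuracy)
learn.loss_func           # e.g. CrossEntropyLossFlat(), inferred from the categorical target
learn.fit_one_cycle(1)
```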
Main functions
TabularLearner
TabularLearner (dls:fastai.data.core.DataLoaders, model:Callable, loss_func:Optional[Callable]=None, opt_func:fastai.optimizer.Optimizer|fastai.optimizer.OptimWrapper=<function Adam>, lr:float|slice=0.001, splitter:Callable=<function trainable_params>, cbs:fastai.callback.core.Callback|collections.abc.MutableSequence|None=None, metrics:Union[Callable,collections.abc.MutableSequence,NoneType]=None, path:str|pathlib.Path|None=None, model_dir:str|pathlib.Path='models', wd:float|int|None=None, wd_bn_bias:bool=False, train_bn:bool=True, moms:tuple=(0.95, 0.85, 0.95), default_cbs:bool=True)
Learner for tabular data
| | Type | Default | Details |
|---|---|---|---|
| dls | DataLoaders | | DataLoaders containing fastai or PyTorch DataLoaders |
| model | Callable | | PyTorch model for training or inference |
| loss_func | Optional | None | Loss function. Defaults to dls loss |
| opt_func | fastai.optimizer.Optimizer \| fastai.optimizer.OptimWrapper | Adam | Optimization function for training |
| lr | float \| slice | 0.001 | Default learning rate |
| splitter | Callable | trainable_params | Split model into parameter groups. Defaults to one parameter group |
| cbs | fastai.callback.core.Callback \| collections.abc.MutableSequence \| None | None | Callbacks to add to Learner |
| metrics | Union | None | Metrics to calculate on validation set |
| path | str \| pathlib.Path \| None | None | Parent directory to save, load, and export models. Defaults to dls path |
| model_dir | str \| pathlib.Path | models | Subdirectory to save and load models |
| wd | float \| int \| None | None | Default weight decay |
| wd_bn_bias | bool | False | Apply weight decay to normalization and bias parameters |
| train_bn | bool | True | Train frozen normalization layers |
| moms | tuple | (0.95, 0.85, 0.95) | Default momentum for schedulers |
| default_cbs | bool | True | Include default Callbacks |
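For illustration, here is a hedged sketch of building a TabularLearner by hand instead of going through tabular_learner; get_emb_sz, TabularModel and CrossEntropyLossFlat are the pieces tabular_learner normally wires together, and the output size of 2 assumes the binary salary target from the example above:

```python
# Sketch: constructing a TabularLearner directly (assumes `dls` from the example above).
emb_szs = get_emb_sz(dls.train_ds)                       # embedding sizes for the categorical columns
model = TabularModel(emb_szs, n_cont=len(dls.cont_names),
                     out_sz=2, layers=[200, 100])        # 2 classes: the binary salary target
learn = TabularLearner(dls, model, loss_func=CrossEntropyLossFlat(), metrics=accuracy)
```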
It works exactly like a normal Learner; the only difference is that it implements a predict method that works on a single row of data.
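Because of that, the usual Learner workflow applies unchanged; a sketch reusing the `learn` object from above (the exported file name is just an example):

```python
# Everything a plain Learner offers still works on a TabularLearner.
learn.lr_find()            # learning-rate finder
learn.show_results()       # decoded predictions on a validation batch
learn.export('adult.pkl')  # serialize; reload later with load_learner('adult.pkl')
```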
tabular_learner
tabular_learner (dls:TabularDataLoaders, layers:list=None, emb_szs:list=None, config:dict=None, n_out:int=None, y_range:Tuple=None, loss_func:Callable|None=None, opt_func:Optimizer|OptimWrapper=<function Adam>, lr:float|slice=0.001, splitter:Callable=<function trainable_params>, cbs:Callback|MutableSequence|None=None, metrics:Callable|MutableSequence|None=None, path:str|Path|None=None, model_dir:str|Path='models', wd:float|int|None=None, wd_bn_bias:bool=False, train_bn:bool=True, moms:tuple=(0.95, 0.85, 0.95), default_cbs:bool=True)
Get a Learner using dls, with metrics, including a TabularModel created using the remaining params.
| | Type | Default | Details |
|---|---|---|---|
| dls | DataLoaders | | DataLoaders containing fastai or PyTorch DataLoaders |
| layers | list | None | Size of the layers generated by LinBnDrop |
| emb_szs | list | None | Tuples of n_unique, embedding_size for all categorical features |
| config | dict | None | Config params for TabularModel from tabular_config |
| n_out | int | None | Final output size of the model |
| y_range | Tuple | None | Low and high for the final sigmoid function |
| loss_func | Optional | None | Loss function. Defaults to dls loss |
| opt_func | fastai.optimizer.Optimizer \| fastai.optimizer.OptimWrapper | Adam | Optimization function for training |
| lr | float \| slice | 0.001 | Default learning rate |
| splitter | Callable | trainable_params | Split model into parameter groups. Defaults to one parameter group |
| cbs | fastai.callback.core.Callback \| collections.abc.MutableSequence \| None | None | Callbacks to add to Learner |
| metrics | Union | None | Metrics to calculate on validation set |
| path | str \| pathlib.Path \| None | None | Parent directory to save, load, and export models. Defaults to dls path |
| model_dir | str \| pathlib.Path | models | Subdirectory to save and load models |
| wd | float \| int \| None | None | Default weight decay |
| wd_bn_bias | bool | False | Apply weight decay to normalization and bias parameters |
| train_bn | bool | True | Train frozen normalization layers |
| moms | tuple | (0.95, 0.85, 0.95) | Default momentum for schedulers |
| default_cbs | bool | True | Include default Callbacks |
If your data was built with fastai, you probably won't need to pass anything to emb_szs unless you want to change the library defaults (produced by get_emb_sz); the same goes for n_out, which should be automatically inferred. layers defaults to [200,100] and is passed to TabularModel along with the config.

Use tabular_config to create a config and customize the model used. y_range is exposed directly only because this argument is often used. All the other arguments are passed to Learner.
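As a hedged sketch of that customization (the dropout probabilities, layer sizes, and the workclass embedding override are illustrative values, not library defaults):

```python
# Sketch: customizing the TabularModel created by tabular_learner
# (assumes the `dls` built above; all numbers here are illustrative).
config = tabular_config(ps=[0.1, 0.1], embed_p=0.05)   # dropout on the linear layers and embeddings
learn = tabular_learner(
    dls,
    layers=[500, 250],           # sizes of the LinBnDrop blocks, instead of the default [200, 100]
    emb_szs={'workclass': 10},   # override the embedding size get_emb_sz would pick for one column
    config=config,
)
# For a regression target you could additionally pass y_range=(lo, hi)
# to bound the output with a final sigmoid.
```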
TabularLearner.predict
TabularLearner.predict (row:pandas.core.series.Series)
Predict on a single sample
| | Type | Details |
|---|---|---|
| row | Series | Features to be predicted |
We can pass an individual row of data to our TabularLearner's predict method. Its output is slightly different from the other predict methods, as this one always returns the input as well:
row, clas, probs = learn.predict(df.iloc[0])
row.show()
| | workclass | education | marital-status | occupation | relationship | race | education-num_na | age | fnlwgt | education-num | salary |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | Private | Assoc-acdm | Married-civ-spouse | #na# | Wife | White | False | 49.0 | 101320.001685 | 12.0 | <50k |
clas, probs
(tensor(0), tensor([0.5264, 0.4736]))
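For more than a handful of rows, building a test DataLoader and calling get_preds is usually more convenient than looping over predict; a sketch reusing the df loaded above (the five-row slice is just for illustration):

```python
# Sketch: batch inference, assuming `learn` and `df` from the examples above.
test_df = df.iloc[:5].copy()            # pretend these are new, unlabeled rows
test_dl = learn.dls.test_dl(test_df)    # applies the same procs (Categorify, FillMissing, Normalize)
preds, _ = learn.get_preds(dl=test_dl)  # class probabilities for each row
print(preds.argmax(dim=1))              # predicted class indices
```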