trw.hparams.creators
Here we implement useful default hyper-parameter creators that are registered
in the trw.hparams.HyperParameterRepository
Module Contents¶
Functions¶
- create_optimizers_fn – Create hyper-parameters for a wide range of optimizer search.
- create_activation – Create activation functions.
- create_norm_type – Create a normalization layer type hyper-parameter.
- create_pool_type – Create a pooling type hyper-parameter.
Attributes¶
- trw.hparams.creators.logger¶
- trw.hparams.creators.create_optimizers_fn(datasets: trw.basic_typing.Datasets, model: torch.nn.Module, optimizers: Sequence[typing_extensions.Literal['adam', 'sgd']] = ('adam', 'sgd'), lr_range: Tuple[float, float, float] = (0.001, -5, -1), momentum: Sequence[float] = (0.5, 0.9, 0.99), beta_1: Sequence[float] = (0.9,), beta_2: Sequence[float] = (0.999, 0.99), eps: Sequence[float] = (1e-08,), weight_decay: Optional[Sequence[float]] = (0.0, 0.0001, 1e-05, 1e-06, 1e-08), name_prefix='trw.') → torch.optim.Optimizer¶
Create hyper-parameters for a wide range of optimizer search.
Hyper-parameters are named using two groups:
- trw.optimizers.*: the most important hyper-parameters to search
- trw.optimizers_fine.*: hyper-parameters that we might want to search but that in most cases would not significantly influence the results. These hyper-parameters may be discarded during the hyper-parameter optimization
- Parameters
datasets – the datasets
model – the model to be optimized
optimizers – the optimizers to search
lr_range – the learning rate range (min, max)
momentum – the momentum values to test
beta_1 – the beta_1 values to test
beta_2 – the beta_2 values to test
eps – the epsilon values to test
weight_decay – the weight decay values to test
name_prefix – prefix appended to the hyper-parameter name
- Returns
A dict of optimizers, one per dataset
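To illustrate the two naming groups described above, here is a minimal pure-Python sketch; the helpers `discrete_choice` and `log_uniform` and the parameter names are hypothetical stand-ins, not the trw implementation, which registers each value in the HyperParameterRepository instead of sampling directly:

```python
import random

def discrete_choice(name, values):
    # In trw, a discrete hyper-parameter would be registered in the
    # HyperParameterRepository under `name`; here we simply sample one value.
    return random.choice(list(values))

def log_uniform(name, exp_min, exp_max):
    # Draw 10**u with u uniform in [exp_min, exp_max] (a log-uniform scale),
    # a common way to search learning rates over several orders of magnitude.
    return 10.0 ** random.uniform(exp_min, exp_max)

# The most influential hyper-parameters use the `trw.optimizers.*` prefix...
optimizer_name = discrete_choice('trw.optimizers.optimizer', ('adam', 'sgd'))
lr = log_uniform('trw.optimizers.learning_rate', -5, -1)

# ...while secondary ones use `trw.optimizers_fine.*` and may be discarded
# during the hyper-parameter optimization.
eps = discrete_choice('trw.optimizers_fine.eps', (1e-08,))
```

The separation lets an optimization loop prioritize (or prune) the fine-grained group without touching the parameters that matter most.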
- trw.hparams.creators.create_activation(name: str, default_value: torch.nn.Module, functions: Sequence[trw.basic_typing.ModuleCreator] = (nn.ReLU, nn.ReLU6, nn.LeakyReLU, nn.ELU, nn.PReLU, nn.RReLU, nn.SELU, nn.CELU, nn.Softplus)) → torch.nn.Module¶
Create activation functions
- Parameters
name – the name of the hyper-parameter
functions – the activation functions
default_value – the default value at creation
- Returns
a functor to create the activation function
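The key point is that the hyper-parameter holds a constructor (functor), not an instantiated module. A rough sketch under that assumption, with plain callables standing in for torch.nn modules so it stays self-contained (`create_activation_sketch` is a hypothetical simplification, not the trw API):

```python
import random

def create_activation_sketch(name, default_value, functions):
    # trw would register a discrete hyper-parameter under `name`;
    # here we simply pick one of the candidate constructors.
    return random.choice(list(functions))

# Plain callables stand in for nn.ReLU, nn.LeakyyReLU, etc.
relu = lambda: (lambda x: max(0.0, x))
leaky_relu = lambda: (lambda x: x if x > 0 else 0.01 * x)

# The returned value is a functor; the caller instantiates it where needed.
activation_fn = create_activation_sketch('encoder.activation', relu,
                                         [relu, leaky_relu])()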
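The key point is that the hyper-parameter holds a constructor (functor), not an instantiated module. A rough sketch under that assumption, with plain callables standing in for torch.nn modules so it stays self-contained (`create_activation_sketch` is a hypothetical simplification, not the trw API):

```python
import random

def create_activation_sketch(name, default_value, functions):
    # trw would register a discrete hyper-parameter under `name`;
    # here we simply pick one of the candidate constructors.
    return random.choice(list(functions))

# Plain callables stand in for nn.ReLU, nn.LeakyReLU, etc.
relu = lambda: (lambda x: max(0.0, x))
leaky_relu = lambda: (lambda x: x if x > 0 else 0.01 * x)

# The returned value is a functor; the caller instantiates it where needed.
activation_fn = create_activation_sketch('encoder.activation', relu,
                                         [relu, leaky_relu])()
```

Returning the constructor rather than the module lets the same hyper-parameter be reused to build fresh activation instances in every layer that needs one.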
- trw.hparams.creators.create_norm_type(name: str, default_value: Optional[trw.layers.layer_config.NormType], norms: Sequence[Optional[trw.layers.layer_config.NormType]] = (NormType.BatchNorm, NormType.InstanceNorm, None)) → trw.layers.layer_config.NormType¶
Create a normalization layer type hyper-parameter
- Parameters
name – the name of the hyper-parameter
norms – a sequence of NormType to test
default_value – the default value at creation
- Returns
a normalization layer type
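Note that None is a valid candidate, meaning no normalization layer at all. A hedged sketch of this choice, using a local enum whose member names mirror trw.layers.layer_config.NormType (the enum and `create_norm_type_sketch` are illustrative stand-ins, not the trw definitions):

```python
import enum
import random

class NormType(enum.Enum):
    # Local stand-in mirroring the NormType names used in the signature above.
    BatchNorm = 'batch_norm'
    InstanceNorm = 'instance_norm'

def create_norm_type_sketch(name, default_value, norms):
    # trw would register a discrete hyper-parameter under `name`;
    # here we simply sample. `None` means "no normalization layer".
    return random.choice(list(norms))

norm = create_norm_type_sketch(
    'unet.norm', NormType.BatchNorm,
    (NormType.BatchNorm, NormType.InstanceNorm, None))
```

create_pool_type below follows the exact same pattern with PoolType candidates instead of NormType ones.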
- trw.hparams.creators.create_pool_type(name: str, default_value: trw.layers.layer_config.PoolType, pools: Sequence[trw.layers.layer_config.PoolType] = (PoolType.MaxPool, PoolType.AvgPool, PoolType.FractionalMaxPool)) → trw.layers.layer_config.PoolType¶
Create a pooling type hyper-parameter
- Parameters
name – the name of the hyper-parameter
pools – the available pooling types
default_value – the default value at creation
- Returns
a pooling type