core.modules.scheduler#
Classes#
| Class | Description |
|---|---|
| LRScheduler | Learning rate scheduler class for torch.optim learning rate schedulers |
Module Contents#
- class core.modules.scheduler.LRScheduler(optimizer, config)#
Learning rate scheduler class for torch.optim learning rate schedulers
Notes
If no learning rate scheduler is specified in the config, the default scheduler is warmup_lr_lambda (fairchem.core.common.utils) rather than no scheduler; this is for backward-compatibility reasons. To run without a learning rate scheduler, specify scheduler: "Null" in the optim section of the config.
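For example, an optim config dict that disables the scheduler might look like the following minimal sketch; only the scheduler entry comes from the note above, the other key is an assumed illustration:

```python
# Illustrative optim config dict; only the "scheduler" key is prescribed by the note above.
optim_config = {
    "lr_initial": 1e-3,   # assumed key name for the base learning rate
    "scheduler": "Null",  # run without a learning rate scheduler
}
```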
- Parameters:
optimizer (obj) – torch.optim optimizer object
config (dict) – Optim dict from the input config
- optimizer#
- config#
- step(metrics=None, epoch=None) → None#
- filter_kwargs(config)#
- get_lr()#
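A minimal usage sketch, assuming the class is importable as fairchem.core.modules.scheduler.LRScheduler and that step() is called once per update; the import path, the model, and the config keys other than scheduler are assumptions:

```python
import torch
from fairchem.core.modules.scheduler import LRScheduler  # assumed import path

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Optim dict from the input config; "Null" disables the scheduler per the note above.
optim_config = {"scheduler": "Null"}

scheduler = LRScheduler(optimizer, optim_config)

for epoch in range(3):
    # ... forward pass, loss.backward(), optimizer.step() ...
    scheduler.step(metrics=None, epoch=epoch)
    print(scheduler.get_lr())
```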