core.modules.scheduler#

Classes#

CosineLRLambda

LRScheduler

Learning rate scheduler class for torch.optim learning rate schedulers

Module Contents#

class core.modules.scheduler.CosineLRLambda(warmup_epochs: int, warmup_factor: float, epochs: int, lr_min_factor: float)#
warmup_epochs#
lr_warmup_factor#
max_epochs#
lr_min_factor#
__call__(current_step: int) → float#
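For orientation, here is a minimal sketch of how a warmup-plus-cosine lambda of this kind is typically plugged into torch.optim.lr_scheduler.LambdaLR. The function body below is an illustrative re-implementation based on the parameter names in the signature above, not the library's exact code.

```python
import math

import torch


def cosine_lr_lambda(current_step: int, warmup_epochs: int = 2,
                     warmup_factor: float = 0.1, epochs: int = 100,
                     lr_min_factor: float = 0.01) -> float:
    """Illustrative warmup + cosine LR multiplier (a sketch, not the library code)."""
    if current_step <= warmup_epochs:
        # Linear warmup from warmup_factor up to 1.0.
        alpha = current_step / float(warmup_epochs)
        return warmup_factor * (1.0 - alpha) + alpha
    # Cosine decay from 1.0 down to lr_min_factor over the remaining steps.
    progress = min(1.0, (current_step - warmup_epochs) / max(1, epochs - warmup_epochs))
    return lr_min_factor + 0.5 * (1.0 - lr_min_factor) * (1.0 + math.cos(math.pi * progress))


model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=cosine_lr_lambda)
```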
class core.modules.scheduler.LRScheduler(optimizer, config)#

Learning rate scheduler class for torch.optim learning rate schedulers

Notes

If no learning rate scheduler is specified in the config, the default scheduler is warmup_lr_lambda (fairchem.core.common.utils) rather than no scheduler; this is for backward-compatibility reasons. To run without an LR scheduler, specify scheduler: "Null" in the optim section of the config.
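As a hedged illustration of that note, the optim dict passed to LRScheduler might look like the following; key names other than "scheduler" are assumptions for illustration only.

```python
# No "scheduler" key: falls back to the default warmup_lr_lambda behaviour.
optim_with_default = {"lr_initial": 1.0e-3}

# Explicitly disable LR scheduling, as described in the note above.
optim_without_sched = {"lr_initial": 1.0e-3, "scheduler": "Null"}
```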

Parameters:
  • optimizer (obj) – torch optim object

  • config (dict) – Optim dict from the input config

optimizer#
config#
step(metrics=None, epoch=None) → None#
filter_kwargs(config)#
get_lr()#
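A minimal usage sketch, assuming the module is importable as fairchem.core.modules.scheduler and that filter_kwargs() selects scheduler arguments out of the optim dict by matching the chosen torch.optim.lr_scheduler class signature; the step_size and gamma keys below are assumptions under that reading.

```python
import torch

from fairchem.core.modules.scheduler import LRScheduler  # assumed import path

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

config = {
    "lr_initial": 1e-3,
    "scheduler": "StepLR",  # any torch.optim.lr_scheduler class name
    "step_size": 10,        # assumed: filter_kwargs() forwards keys matching StepLR
    "gamma": 0.5,
}
scheduler = LRScheduler(optimizer, config)

for epoch in range(3):
    # ... one training epoch ...
    scheduler.step()            # ReduceLROnPlateau would additionally need metrics=...
    print(scheduler.get_lr())   # current learning rate
```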