core.common.logger#

Copyright (c) Meta, Inc. and its affiliates.

This source code is licensed under the MIT license found in the LICENSE file in the root directory of this source tree.

Classes#

Logger

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

WandBLogger

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

TensorboardLogger

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

WandBSingletonLogger

Singleton version of the wandb logger; this forces a single instance of the logger to be created and used from anywhere in the code (not just from the trainer).

Functions#

log_stats(x, prefix)

Module Contents#

class core.common.logger.Logger(config)#

Bases: abc.ABC

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

config#
abstract watch(model, log_freq: int = 1000)#

Monitor parameters and gradients.

log(update_dict, step: int, split: str = '')#

Log some values.

abstract log_plots(plots) → None#
abstract mark_preempting() → None#
abstract log_summary(summary_dict: dict[str, Any]) → None#
abstract log_artifact(name: str, type: str, file_location: str) → None#
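
Since Logger is abstract, concrete backends implement the methods above. A minimal sketch of a custom subclass, assuming only the documented interface (the ConsoleLogger class here is hypothetical, not part of this module):

```python
from typing import Any

from core.common.logger import Logger


class ConsoleLogger(Logger):
    """Hypothetical Logger backend that simply prints to stdout."""

    def watch(self, model, log_freq: int = 1000) -> None:
        # Nothing to hook for a console backend; real backends register
        # parameter/gradient monitoring here.
        pass

    def log(self, update_dict, step: int, split: str = "") -> None:
        # Prefix keys with the split name, mirroring the base-class contract.
        prefix = f"{split}/" if split else ""
        values = ", ".join(f"{prefix}{k}={v}" for k, v in update_dict.items())
        print(f"[step {step}] {values}")

    def log_plots(self, plots) -> None:
        print(f"received {len(plots)} plot(s); console backend cannot render them")

    def mark_preempting(self) -> None:
        print("run marked as preempting")

    def log_summary(self, summary_dict: dict[str, Any]) -> None:
        for key, value in summary_dict.items():
            print(f"summary: {key}={value}")

    def log_artifact(self, name: str, type: str, file_location: str) -> None:
        print(f"artifact {name!r} ({type}) stored at {file_location}")


console = ConsoleLogger(config={})
console.log({"loss": 0.5}, step=1, split="train")
```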
class core.common.logger.WandBLogger(config)#

Bases: Logger

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

watch(model, log='all', log_freq: int = 1000) → None#

Monitor parameters and gradients.

log(update_dict, step: int, split: str = '') → None#

Log some values.

log_plots(plots, caption: str = '') → None#
log_table(name: str, cols: list, data: list, step: int | None = None, commit=False) → None#
log_summary(summary_dict: dict[str, Any])#
mark_preempting() → None#
log_artifact(name: str, type: str, file_location: str) → None#
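
A hedged usage sketch: the keys expected inside config come from the trainer configuration and are not documented in this module, so the empty dict below is only a placeholder.

```python
import torch

from core.common.logger import WandBLogger

config: dict = {}  # placeholder; real runs pass the trainer's config dict

logger = WandBLogger(config)

model = torch.nn.Linear(8, 1)
logger.watch(model, log="all", log_freq=500)        # monitor params and gradients
logger.log({"loss": 0.42}, step=10, split="train")  # presumably logged under the train/ prefix
logger.log_summary({"best_val_mae": 0.021})
```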
class core.common.logger.TensorboardLogger(config)#

Bases: Logger

Generic class to interface with various logging modules, e.g. wandb, tensorboard, etc.

writer#
watch(model, log_freq: int = 1000) → bool#

Monitor parameters and gradients.

log(update_dict, step: int, split: str = '')#

Log some values.

mark_preempting() → None#
log_plots(plots) → None#
log_summary(summary_dict: dict[str, Any]) → None#
log_artifact(name: str, type: str, file_location: str) → None#
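
The writer attribute is presumably a torch.utils.tensorboard.SummaryWriter (an assumption based on the name, not documented here); usage otherwise mirrors WandBLogger.

```python
from core.common.logger import TensorboardLogger

config: dict = {}  # placeholder, as above
tb_logger = TensorboardLogger(config)

tb_logger.log({"loss": 0.42}, step=10, split="train")
tb_logger.writer.flush()  # assumes a SummaryWriter-like interface
```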
class core.common.logger.WandBSingletonLogger#

Singleton version of the wandb logger; this forces a single instance of the logger to be created and used from anywhere in the code (not just from the trainer). This will replace the original WandBLogger.

We initialize the wandb instance once, globally, in the trainer/runner:

WandBSingletonLogger.init_wandb(…)

Then, from anywhere in the code, we can fetch the singleton instance and log to wandb. Note that this allows you to log without explicitly knowing which step you are on; see https://docs.wandb.ai/ref/python/log/#the-wb-step for more details:

WandBSingletonLogger.get_instance().log({"some_value": value}, commit=False)

_instance = None#
classmethod initialized() → bool#
classmethod init_wandb(config: dict, run_id: str, run_name: str, log_dir: str, project: str, entity: str, group: str | None = None) → None#
classmethod get_instance()#
watch(model, log='all', log_freq: int = 1000) → None#
log(update_dict: dict, step: int | None = None, commit=False, split: str = '') → None#
log_table(name: str, cols: list, data: list, step: int | None = None, commit=False) → None#
log_summary(summary_dict: dict[str, Any])#
mark_preempting() → None#
log_artifact(name: str, type: str, file_location: str) → None#
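
Putting the documented pieces together, an end-to-end sketch of the singleton pattern described above (all argument values are placeholders):

```python
from core.common.logger import WandBSingletonLogger

# Done once, e.g. during trainer/runner setup:
WandBSingletonLogger.init_wandb(
    config={"lr": 1e-3},   # run config to record (placeholder values)
    run_id="run-001",
    run_name="debug-run",
    log_dir="./wandb_logs",
    project="my-project",
    entity="my-team",
)

# From anywhere else in the code, with no step bookkeeping required:
if WandBSingletonLogger.initialized():
    WandBSingletonLogger.get_instance().log({"some_value": 0.42}, commit=False)
```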
core.common.logger.log_stats(x: torch.Tensor, prefix: str)#
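
A hedged usage sketch: judging by the signature, log_stats records summary statistics of a tensor under the given prefix, though the exact metrics it emits are not documented here.

```python
import torch

from core.common.logger import log_stats

grad_norms = torch.randn(128).abs()
log_stats(grad_norms, prefix="grad_norm")  # e.g. stats logged under the "grad_norm" prefix
```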