mm.training.callbacks.TensorboardCallback

class mrmustard.training.callbacks.TensorboardCallback(tag=None, steps_per_call=1, root_logdir='./tb_logdir', experiment_tag=None, prefix=None, cost_converter=None, track_grads=False, log_objectives=True, log_trainables=False)[source]

Bases: Callback

Callback for enabling Tensorboard tracking of optimization progress.

Things tracked:

  • the cost

  • the transformed cost, if a cost_converter is provided

  • trainable parameter values

  • trainable parameter gradients (if track_grads is True)

To start the Tensorboard frontend, either:

  • use VSCode: F1 -> Tensorboard -> select your root_logdir/experiment_tag.

  • use the command line: tensorboard --logdir=root_logdir/experiment_tag and open the link in a browser.

cost_converter

Transformation applied to the cost before logging, for easier interpretation.

experiment_tag

The tag for the experiment subfolder, used to group similar optimizations together for easy comparison.

log_objectives

Whether to include objective values in the callback results to be stored.

log_trainables

Whether to include parameter values in the callback results to be stored.

prefix

Extra prefix to name the optimization experiment.

root_logdir

The root logdir for tensorboard logging.

steps_per_call

Sets the calling frequency of this callback.

tag

Custom tag for a callback instance, used as its key in Optimizer.callback_history.

track_grads

Whether to track gradients as well as the values for trainable parameters.

cost_converter: Optional[Callable] = None

Transformation applied to the cost before logging, for easier interpretation.
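For example, when the cost is a probability, a log-scale converter (a hypothetical choice, plain Python) makes small improvements visible on the Tensorboard chart:

```python
import math

def prob_to_db(p: float) -> float:
    """Convert a probability-valued cost to decibels: -10 * log10(p)."""
    return -10.0 * math.log10(p)

# This callable can be passed as TensorboardCallback(cost_converter=prob_to_db);
# Tensorboard then shows both the raw and the transformed cost.
print(prob_to_db(0.1))  # 10.0
```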

experiment_tag: Optional[str] = None

The tag for the experiment subfolder, used to group similar optimizations together for easy comparison. Defaults to the hash of all trainable variables' names.

log_objectives: bool = True

Whether to include objective values in the callback results to be stored.

log_trainables: bool = False

Whether to include parameter values in the callback results to be stored.

prefix: Optional[str] = None

Extra prefix to name the optimization experiment.

root_logdir: Union[str, Path] = './tb_logdir'

The root logdir for tensorboard logging.

steps_per_call: int = 1

Sets the calling frequency of this callback. Defaults to once per optimization step; use higher values to reduce overhead.

tag: Optional[str] = None

Custom tag for a callback instance, used as its key in Optimizer.callback_history. Defaults to the class name.

track_grads: bool = False

Whether to track gradients as well as the values for trainable parameters.

__call__(**kwargs)

Call self as a function.

call(optimizer, cost, trainables, **kwargs)

Logs costs and parameters to Tensorboard.

get_opt_step(optimizer, **kwargs)

Gets current step from optimizer.

init_writer(trainables)

Initializes the Tensorboard logdir folders and summary writer.

trigger(**kwargs)

User-implemented custom trigger conditions.

update_cost_fn(**kwargs)

User-implemented cost_fn modifier.

update_grads(**kwargs)

User-implemented gradient modifier.

update_optimizer(optimizer, **kwargs)

User-implemented optimizer update scheduler.

__call__(**kwargs)

Call self as a function.

call(optimizer, cost, trainables, **kwargs)[source]

Logs costs and parameters to Tensorboard.

get_opt_step(optimizer, **kwargs)

Gets current step from optimizer.

init_writer(trainables)[source]

Initializes the Tensorboard logdir folders and summary writer.

trigger(**kwargs)

User-implemented custom trigger conditions.

Return type:

bool

update_cost_fn(**kwargs)

User-implemented cost_fn modifier.

Return type:

Optional[Callable]

update_grads(**kwargs)

User-implemented gradient modifier.

Return type:

Optional[Sequence]

update_optimizer(optimizer, **kwargs)

User-implemented optimizer update scheduler.