Optimizer

Optimizer wrapper that resolves PyTorch optimizers by name.

This module provides a thin wrapper around torch.optim that allows optimizers to be instantiated from CLI strings while exposing a uniform interface for learning-rate adjustments.

Example:

>>> from experiments import Optimizer, Model
>>> model = Model("lenet", num_classes=10)
>>> optim = Optimizer("adam", model, lr=0.001)
>>> optim.set_lr(0.0001)

class experiments.optimizer.Optimizer(name_build, model, *args, **kwargs)

Bases: object

Optimizer wrapper with name resolution and LR control.

Parameters:
  • name_build (str or callable) – Optimizer name (e.g. "adam", "sgd") or a constructor function. Names are resolved against torch.optim.

  • model (experiments.Model) – Model whose parameters will be optimized.

  • *args (object) – Additional positional arguments forwarded to the optimizer constructor.

  • **kwargs (object) – Additional keyword arguments forwarded to the optimizer constructor.

Raises:
  • tools.UnavailableException – If name_build is a string that does not match any known optimizer.
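
Conceptually, string resolution amounts to a case-insensitive lookup of the optimizer classes exported by torch.optim, with callables passed through unchanged. The sketch below is illustrative only; the helper name resolve_optimizer and the local UnavailableException stand-in are assumptions, not the actual experiments.optimizer internals.

    import torch.optim

    class UnavailableException(Exception):
        """Stand-in for tools.UnavailableException (assumed for illustration)."""

    def resolve_optimizer(name_build):
        # Callables (e.g. a custom constructor) are used as-is.
        if callable(name_build):
            return name_build
        # Case-insensitive lookup of the optimizer classes exposed by torch.optim.
        available = {
            name.lower(): cls
            for name, cls in vars(torch.optim).items()
            if isinstance(cls, type) and issubclass(cls, torch.optim.Optimizer)
        }
        try:
            return available[name_build.lower()]
        except KeyError:
            raise UnavailableException(f"unknown optimizer: {name_build!r}")

The resolved constructor is then presumably called with the model's parameters plus the forwarded *args and **kwargs.
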

set_lr(lr)

Set the learning rate for all parameter groups.

Parameters:
  • lr (float) – New learning rate.
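
For a torch.optim optimizer this is equivalent to overwriting the "lr" entry of every parameter group, as the minimal sketch below shows (the attribute name optim for the wrapped optimizer is an assumption, not the actual implementation):

    def set_lr(self, lr):
        # Overwrite the learning rate of every parameter group on the wrapped
        # torch.optim optimizer (stored here as self.optim; attribute name assumed).
        for group in self.optim.param_groups:
            group["lr"] = lr
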

See also

For the model whose parameters are optimized, see Model. For the loss that drives optimization, see Loss.