Loss

Loss and criterion wrappers for training and evaluation.

This module provides:

  • Loss — differentiable loss functions (includes L1/L2 regularization and all standard PyTorch losses) with arithmetic composition support.

  • Criterion — non-differentiable evaluation metrics (top-k accuracy, sigmoid accuracy).

Example:

>>> from experiments import Loss, Criterion
>>> loss = Loss("crossentropy") + 0.01 * Loss("l2")
>>> crit = Criterion("top-k", k=5)

class experiments.loss.Criterion(name_build, *args, **kwargs)

Bases: object

Non-differentiable evaluation metric wrapper.

Available criteria:

  • "top-k" — top-k classification accuracy.

  • "sigmoid" — binary accuracy with sigmoid threshold at 0.5.

All criteria return a 1-D tensor of the form [num_correct, batch_size], so the accuracy is num_correct / batch_size.

Parameters:
  • name_build (str or callable) – Criterion name or constructor function.

  • *args (object) – Forwarded to the criterion constructor.

  • **kwargs (object) – Forwarded to the criterion constructor.

Raises:
  • tools.UnavailableException – If name_build is an unknown string.
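
The counting rule behind the "top-k" criterion can be illustrated with a minimal pure-Python sketch. This is a hypothetical re-implementation of the semantics only — the real Criterion operates on PyTorch tensors, and the helper name top_k_correct is an assumption, not part of the module's API:

```python
def top_k_correct(scores, targets, k=5):
    """Count samples whose true label is among the k highest scores.

    Pure-Python sketch (hypothetical) of the counting rule behind
    Criterion("top-k"); the real wrapper works on tensors, but the
    returned pair mirrors [num_correct, batch_size].
    """
    correct = 0
    for row, target in zip(scores, targets):
        # Indices of the k largest scores in this row.
        top_k = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        if target in top_k:
            correct += 1
    return [correct, len(targets)]
```

Dividing the first element by the second yields the top-k accuracy for the batch.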

class experiments.loss.Loss(name_build, *args, **kwargs)

Bases: object

Differentiable loss function wrapper with composition support.

Losses can be added (loss1 + loss2) and scaled (0.5 * loss). All standard PyTorch losses are available by lower-case name. Additionally, "l1" and "l2" provide parameter-norm regularization.

Parameters:
  • name_build (str or callable) – Loss name (e.g. "crossentropy", "mse") or a callable with signature (output, target, params) -> tensor.

  • *args (object) – Forwarded to the loss constructor when name_build is a string.

  • **kwargs (object) – Forwarded to the loss constructor when name_build is a string.

Raises:
  • tools.UnavailableException – If name_build is an unknown string.
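
The composition pattern (loss1 + loss2, 0.5 * loss) can be sketched with operator overloading. This is a hypothetical, stripped-down illustration — the real experiments.loss.Loss also resolves PyTorch losses by name and handles tensors; here the callables follow the documented (output, target, params) -> value signature on plain Python numbers:

```python
class ComposableLoss:
    """Hypothetical sketch of Loss's arithmetic-composition support."""

    def __init__(self, fn):
        # fn follows the documented (output, target, params) -> value signature.
        self._fn = fn

    def __call__(self, output, target, params=None):
        return self._fn(output, target, params)

    def __add__(self, other):
        # loss1 + loss2: evaluate both and sum the results.
        return ComposableLoss(lambda o, t, p: self(o, t, p) + other(o, t, p))

    def __rmul__(self, scale):
        # scale * loss: scale the evaluated result.
        return ComposableLoss(lambda o, t, p: scale * self(o, t, p))

# Mean squared error over the outputs, ignoring params.
mse = ComposableLoss(lambda o, t, p: sum((x - y) ** 2 for x, y in zip(o, t)) / len(o))
# L2 penalty over the parameters, ignoring output/target.
l2 = ComposableLoss(lambda o, t, p: sum(w ** 2 for w in p))

total = mse + 0.01 * l2  # composed loss, mirroring Loss("mse") + 0.01 * Loss("l2")
```

Each arithmetic operation returns a new wrapper, so composed losses remain callable with the same signature as their parts.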

See also

For the model that uses the loss, see Model. For the optimizer that minimizes it, see Optimizer.