Loss

Loss and criterion wrappers for training and evaluation.

This module provides:

- `Loss`: derivable loss functions (includes L1/L2 regularization and all PyTorch losses) with arithmetic composition support.
- `Criterion`: non-derivable evaluation metrics (top-k accuracy, sigmoid accuracy).
Example:
>>> from experiments import Loss, Criterion
>>> loss = Loss("crossentropy") + 0.01 * Loss("l2")
>>> crit = Criterion("top-k", k=5)
- class experiments.loss.Criterion(name_build, *args, **kwargs)

  Bases: object

  Non-derivable evaluation metric wrapper.

  Available criteria:

  - "top-k": top-k classification accuracy.
  - "sigmoid": binary accuracy with sigmoid threshold at 0.5.

  All criteria return a 1-D tensor [num_correct, batch_size].
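To make the `[num_correct, batch_size]` contract concrete, here is a minimal sketch of the top-k check in plain Python. The function name `topk_correct` and the list-based inputs are illustrative only; the real `Criterion` operates on PyTorch tensors.

```python
def topk_correct(scores, target, k=5):
    """Count samples whose true class appears among the k highest scores.

    scores: per-sample lists of class scores (batch_size x num_classes)
    target: true class index per sample (batch_size)
    Returns [num_correct, batch_size], mirroring the Criterion contract.
    """
    num_correct = 0
    for row, label in zip(scores, target):
        # indices of the k highest-scoring classes for this sample
        topk = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        if label in topk:
            num_correct += 1
    return [num_correct, len(target)]
```

Accuracy is then `num_correct / batch_size`; keeping the two counts separate lets partial batches be summed correctly across an epoch.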
- class experiments.loss.Loss(name_build, *args, **kwargs)

  Bases: object

  Derivable loss function wrapper with composition support.

  Losses can be added (loss1 + loss2) and scaled (0.5 * loss). All standard PyTorch losses are available by lower-case name. Additionally, "l1" and "l2" provide parameter-norm regularization.

  Parameters:

  - name_build (str or callable): Loss name (e.g. "crossentropy", "mse") or a callable with signature (output, target, params) -> tensor.
  - *args (object): Forwarded to the loss constructor when name_build is a string.
  - **kwargs (object): Forwarded to the loss constructor when name_build is a string.

  Raises:

  - tools.UnavailableException: If name_build is an unknown string.
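The addition and scaling behavior can be sketched with Python's operator protocol. This is a hypothetical `ComposableLoss` illustrating the composition mechanics, not the actual `experiments.loss.Loss` implementation; the `(output, target, params)` signature follows the callable form documented above.

```python
class ComposableLoss:
    """Illustrative loss wrapper supporting ``a + b`` and ``scale * a``."""

    def __init__(self, fn):
        # fn has the documented signature (output, target, params) -> value
        self.fn = fn

    def __call__(self, output, target, params=None):
        return self.fn(output, target, params)

    def __add__(self, other):
        # Sum of two losses is itself a loss with the same signature
        return ComposableLoss(lambda o, t, p: self(o, t, p) + other(o, t, p))

    def __rmul__(self, scale):
        # Enables ``0.01 * loss`` for regularization weighting
        return ComposableLoss(lambda o, t, p: scale * self(o, t, p))


# Hypothetical building blocks: a mean-squared error and an L2 penalty
mse = ComposableLoss(
    lambda o, t, p: sum((a - b) ** 2 for a, b in zip(o, t)) / len(o))
l2 = ComposableLoss(lambda o, t, p: sum(w * w for w in (p or [])))

total = mse + 0.01 * l2
```

Because each composite is again a `ComposableLoss`, expressions like `Loss("crossentropy") + 0.01 * Loss("l2")` from the module example nest arbitrarily.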