neuraltrain.losses.losses.ClipLoss¶
- class neuraltrain.losses.losses.ClipLoss(norm_kind: str | None = 'y', temperature: bool = True, symmetric: bool = True, reduction: str = 'mean')[source]¶
CLIP contrastive loss.
Contrastive Language-Image Pretraining (CLIP) loss from [1]. Default values reflect the configuration of the CLIP loss used in [2].
- Parameters:
norm_kind ({"x", "y", "xy"} or None) –
- How to normalize the estimates and/or candidates before computing their dot products.
- 'x': normalize estimates only.
- 'y': normalize candidates only (the approach originally used in brainmagick).
- 'xy': normalize both estimates and candidates.
- None: do not normalize.
temperature (bool) – If True, use a learnable temperature parameter.
symmetric (bool) – If True, compute loss in both retrieval directions, i.e. retrieve candidates given estimates and retrieve estimates given candidates (requires estimates and candidates to be of the same shape). If False, only do the former.
reduction (str) – Reduction applied to the per-example cross-entropy loss (forwarded to F.cross_entropy).
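To make the parameters concrete, here is a minimal numpy sketch of a CLIP-style contrastive loss (a hypothetical reimplementation, not the neuraltrain code): rows of estimate and candidate are paired, the diagonal of the similarity matrix holds the positives, and symmetric averages the two retrieval directions. For simplicity, temperature is a fixed scalar here rather than the learnable parameter described above, and reduction is fixed to the mean.

```python
import numpy as np

def clip_loss(estimate, candidate, norm_kind="y", temperature=1.0, symmetric=True):
    """Sketch of a CLIP contrastive loss. estimate: [B, C], candidate: [B, C];
    the matching pairs sit on the diagonal of the similarity matrix."""
    x = estimate.astype(float)
    y = candidate.astype(float)
    if norm_kind in ("x", "xy"):
        x = x / np.linalg.norm(x, axis=1, keepdims=True)
    if norm_kind in ("y", "xy"):
        y = y / np.linalg.norm(y, axis=1, keepdims=True)
    logits = temperature * (x @ y.T)  # [B, B] similarity matrix

    def cross_entropy(lg):
        # mean cross-entropy with targets on the diagonal (numerically stable)
        lg = lg - lg.max(axis=1, keepdims=True)
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(logp))

    loss = cross_entropy(logits)  # retrieve candidates given estimates
    if symmetric:
        # also retrieve estimates given candidates, and average
        loss = 0.5 * (loss + cross_entropy(logits.T))
    return loss
```

Well-separated, correctly paired inputs give a loss near zero, while mismatched pairs are heavily penalized.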
References
- forward(estimate: Tensor, candidate: Tensor) Tensor[source]¶
Warning: estimate and candidate are not necessarily symmetric.
If estimate has shape [B, C] and candidate has shape [B', C] with B' >= B, the first B samples of candidate are the targets, while the remaining B' - B samples of candidate are used only as negatives.
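The asymmetric case above can be illustrated with a short numpy sketch (a hypothetical illustration, not the neuraltrain code): logits form a [B, B'] matrix, the target for row i is column i, and columns B..B' only contribute extra negatives to the softmax denominator. Candidates are L2-normalized as in norm_kind='y'; temperature and the symmetric term are omitted since they do not apply when shapes differ.

```python
import numpy as np

def clip_loss_extra_negatives(estimate, candidate):
    """estimate: [B, C], candidate: [B_prime, C] with B_prime >= B.
    The first B candidate rows are targets; the rest are pure negatives."""
    y = candidate / np.linalg.norm(candidate, axis=1, keepdims=True)
    logits = estimate @ y.T  # [B, B_prime]
    logits = logits - logits.max(axis=1, keepdims=True)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # target for estimate row i is candidate column i
    B = estimate.shape[0]
    return -np.mean(logp[np.arange(B), np.arange(B)])
```

Adding extra negative candidates enlarges the softmax denominator, so the loss grows slightly even when every positive pair is well matched.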