@@ -80,6 +80,17 @@ given the prediction `ŷ` and true values `y`.
              | 0.5 * |ŷ - y|^2,            for |ŷ - y| <= δ
 Huber loss = |
              | δ * (|ŷ - y| - 0.5 * δ),    otherwise
+
+# Example
+```jldoctest
+julia> ŷ = [1.1, 2.1, 3.1];
+
+julia> Flux.huber_loss(ŷ, 1:3)  # default δ = 1 > |ŷ - y|
+0.005000000000000009
+
+julia> Flux.huber_loss(ŷ, 1:3, δ=0.05)  # changes behaviour as |ŷ - y| > δ
+0.003750000000000005
+```
 """
 function huber_loss(ŷ, y; agg = mean, δ = ofeltype(ŷ, 1))
     _check_sizes(ŷ, y)
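As a sanity check of the doctest values above, the piecewise Huber formula can be re-implemented in a few lines of plain Python (an illustrative sketch only; the Flux implementation is the authoritative source):

```python
# Illustrative re-implementation of the Huber loss formula above (agg = mean).
def huber_loss(y_hat, y, delta=1.0):
    total = 0.0
    for p, t in zip(y_hat, y):
        r = abs(p - t)
        # quadratic for small residuals, linear beyond delta
        total += 0.5 * r**2 if r <= delta else delta * (r - 0.5 * delta)
    return total / len(y_hat)

print(huber_loss([1.1, 2.1, 3.1], [1, 2, 3]))        # ≈ 0.005  (all residuals <= δ)
print(huber_loss([1.1, 2.1, 3.1], [1, 2, 3], 0.05))  # ≈ 0.00375 (residuals > δ)
```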
@@ -377,12 +388,22 @@ function kldivergence(ŷ, y; dims = 1, agg = mean, ϵ = epseltype(ŷ))
 end

 """
-    poisson_loss(ŷ, y)
+    poisson_loss(ŷ, y; agg = mean)

-Return how much the predicted distribution `ŷ` diverges from the expected Poisson
-distribution `y`; calculated as `sum(ŷ .- y .* log.(ŷ)) / size(y, 2)`.
+Return how much the predicted distribution `ŷ` diverges from the expected Poisson
+distribution `y`; calculated as
+
+    sum(ŷ .- y .* log.(ŷ)) / size(y, 2)

 [More information.](https://peltarion.com/knowledge-center/documentation/modeling-view/build-an-ai-model/loss-functions/poisson).
+
+# Example
+```jldoctest
+julia> y_model = [1, 3, 3];  # data should only take integral values
+
+julia> Flux.poisson_loss(y_model, 1:3)
+0.5023128522198171
+```
 """
 function poisson_loss(ŷ, y; agg = mean)
     _check_sizes(ŷ, y)
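The doctest value can be cross-checked against the formula with a short plain-Python sketch (illustrative only, not Flux code):

```python
import math

# Illustrative re-implementation of the Poisson loss formula above (agg = mean).
def poisson_loss(y_hat, y):
    return sum(p - t * math.log(p) for p, t in zip(y_hat, y)) / len(y_hat)

print(poisson_loss([1, 3, 3], [1, 2, 3]))  # ≈ 0.50231
```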
@@ -392,11 +413,32 @@ end
 """
     hinge_loss(ŷ, y; agg = mean)

-Return the [hinge_loss loss](https://en.wikipedia.org/wiki/Hinge_loss) given the
+Return the [hinge_loss](https://en.wikipedia.org/wiki/Hinge_loss) given the
 prediction `ŷ` and true labels `y` (containing 1 or -1); calculated as
-`sum(max.(0, 1 .- ŷ .* y)) / size(y, 2)`.

+    sum(max.(0, 1 .- ŷ .* y)) / size(y, 2)
+
+Usually used with classifiers like Support Vector Machines.
 See also: [`squared_hinge_loss`](@ref)
+
+# Example
+```jldoctest
+julia> y_true = [1, -1, 1, 1];
+
+julia> y_pred = [0.1, 0.3, 1, 1.5];
+
+julia> Flux.hinge_loss(y_pred, y_true)
+0.55
+
+julia> Flux.hinge_loss(y_pred[1], y_true[1]) != 0  # same sign but |ŷ| < 1
+true
+
+julia> Flux.hinge_loss(y_pred[end], y_true[end]) == 0  # same sign and |ŷ| >= 1
+true
+
+julia> Flux.hinge_loss(y_pred[2], y_true[2]) != 0  # opposite signs
+true
+```
 """
 function hinge_loss(ŷ, y; agg = mean)
     _check_sizes(ŷ, y)
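The hinge loss doctest values follow directly from the formula; a plain-Python sketch (illustrative only, not Flux code) reproduces them:

```python
# Illustrative re-implementation of the hinge loss formula above (agg = mean).
def hinge_loss(y_hat, y):
    return sum(max(0.0, 1.0 - p * t) for p, t in zip(y_hat, y)) / len(y_hat)

print(hinge_loss([0.1, 0.3, 1, 1.5], [1, -1, 1, 1]))  # ≈ 0.55
```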
     squared_hinge_loss(ŷ, y)

 Return the squared hinge loss given the prediction `ŷ` and true labels `y`
-(containing 1 or -1); calculated as `sum((max.(0, 1 .- ŷ .* y)).^2) / size(y, 2)`.
+(containing 1 or -1); calculated as
+
+    sum((max.(0, 1 .- ŷ .* y)).^2) / size(y, 2)

+Usually used with classifiers like Support Vector Machines.
 See also: [`hinge_loss`](@ref)
+
+# Example
+```jldoctest
+julia> y_true = [1, -1, 1, 1];
+
+julia> y_pred = [0.1, 0.3, 1, 1.5];
+
+julia> Flux.squared_hinge_loss(y_pred, y_true)
+0.625
+
+julia> Flux.squared_hinge_loss(y_pred[1], y_true[1]) != 0  # same sign but |ŷ| < 1
+true
+
+julia> Flux.squared_hinge_loss(y_pred[end], y_true[end]) == 0  # same sign and |ŷ| >= 1
+true
+
+julia> Flux.squared_hinge_loss(y_pred[2], y_true[2]) != 0  # opposite signs
+true
+```
 """
 function squared_hinge_loss(ŷ, y; agg = mean)
     _check_sizes(ŷ, y)
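Likewise, the squared hinge values can be checked against the formula with a plain-Python sketch (illustrative only, not Flux code):

```python
# Illustrative re-implementation of the squared hinge loss above (agg = mean).
def squared_hinge_loss(y_hat, y):
    return sum(max(0.0, 1.0 - p * t) ** 2 for p, t in zip(y_hat, y)) / len(y_hat)

print(squared_hinge_loss([0.1, 0.3, 1, 1.5], [1, -1, 1, 1]))  # ≈ 0.625
```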
 Return a loss based on the dice coefficient.
 Used in the [V-Net](https://arxiv.org/abs/1606.04797) image segmentation
 architecture.
-Similar to the F1_score. Calculated as:
+The dice coefficient is similar to the F1_score. Loss calculated as:

     1 - 2*sum(|ŷ .* y| + smooth) / (sum(ŷ.^2) + sum(y.^2) + smooth)
+
+# Example
+```jldoctest
+julia> y_pred = [1.1, 2.1, 3.1];
+
+julia> Flux.dice_coeff_loss(y_pred, 1:3)
+0.000992391663909964
+
+julia> 1 - Flux.dice_coeff_loss(y_pred, 1:3)  # ~ F1 score for image segmentation
+0.99900760833609
+```
 """
 function dice_coeff_loss(ŷ, y; smooth = ofeltype(ŷ, 1.0))
     _check_sizes(ŷ, y)
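The doctest values above can be reproduced in plain Python, assuming `smooth` is added once to the numerator and once to the denominator (an illustrative sketch; the Flux implementation is the authoritative source):

```python
# Illustrative dice-coefficient loss. `smooth` is assumed to be added once to the
# numerator and once to the denominator, which reproduces the doctest values above.
def dice_coeff_loss(y_hat, y, smooth=1.0):
    num = 2.0 * sum(p * t for p, t in zip(y_hat, y)) + smooth
    den = sum(p * p for p in y_hat) + sum(t * t for t in y) + smooth
    return 1.0 - num / den

print(dice_coeff_loss([1.1, 2.1, 3.1], [1, 2, 3]))  # ≈ 0.000992
```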

 Return the [Tversky loss](https://arxiv.org/abs/1706.05721).
 Used with imbalanced data to give more weight to false negatives.
-Larger β weigh recall more than precision (by placing more emphasis on false negatives)
+Larger β weighs recall more than precision (by placing more emphasis on false negatives).
 Calculated as:
-    1 - sum(|y .* ŷ| + 1) / (sum(y .* ŷ + β*(1 .- y) .* ŷ + (1 - β)*y .* (1 .- ŷ)) + 1)
+
+    1 - sum(|y .* ŷ| + 1) / (sum(y .* ŷ + (1 - β)*(1 .- y) .* ŷ + β*y .* (1 .- ŷ)) + 1)
+
 """
 function tversky_loss(ŷ, y; β = ofeltype(ŷ, 0.7))
     _check_sizes(ŷ, y)
@@ -456,6 +533,8 @@ The input, 'ŷ', is expected to be normalized (i.e. [softmax](@ref Softmax) out

 For `γ == 0`, the loss is mathematically equivalent to [`Losses.binarycrossentropy`](@ref).

+See also: [`Losses.focal_loss`](@ref) for the multi-class setting.
+
 # Example
 ```jldoctest
 julia> y = [0 1 0
@@ -473,9 +552,6 @@ julia> ŷ = [0.268941 0.5 0.268941
 julia> Flux.binary_focal_loss(ŷ, y) ≈ 0.0728675615927385
 true
 ```
-
-See also: [`Losses.focal_loss`](@ref) for multi-class setting
-
 """
 function binary_focal_loss(ŷ, y; agg=mean, γ=2, ϵ=epseltype(ŷ))
     _check_sizes(ŷ, y)
@@ -536,7 +612,17 @@ which can be useful for training Siamese Networks. It is given by
     agg(@. (1 - y) * ŷ^2 + y * max(0, margin - ŷ)^2)

 Specify `margin` to set the baseline for distance at which pairs are dissimilar.
-
+
+# Example
+```jldoctest
+julia> ŷ = [0.5, 1.5, 2.5];
+
+julia> Flux.siamese_contrastive_loss(ŷ, 1:3)
+-4.833333333333333
+
+julia> Flux.siamese_contrastive_loss(ŷ, 1:3, margin = 2)
+-4.0
+```
 """
 function siamese_contrastive_loss(ŷ, y; agg = mean, margin::Real = 1)
     _check_sizes(ŷ, y)
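The contrastive-loss doctest values follow from the `agg(@. ...)` expression above; a plain-Python sketch (illustrative only, not Flux code) reproduces them:

```python
# Illustrative re-implementation of the contrastive loss above (agg = mean).
def siamese_contrastive_loss(y_hat, y, margin=1.0):
    terms = [(1 - t) * p**2 + t * max(0.0, margin - p)**2
             for p, t in zip(y_hat, y)]
    return sum(terms) / len(terms)

print(siamese_contrastive_loss([0.5, 1.5, 2.5], [1, 2, 3]))            # ≈ -4.8333
print(siamese_contrastive_loss([0.5, 1.5, 2.5], [1, 2, 3], margin=2))  # ≈ -4.0
```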