tf.compat.v1.losses.compute_weighted_loss


Computes the weighted loss.

tf.compat.v1.losses.compute_weighted_loss(
    losses, weights=1.0, scope=None, loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)

Args:

losses: Tensor of shape [batch_size, d1, ..., dN].
weights: Optional Tensor whose rank is either 0 or the same rank as losses, and which must be broadcastable to losses (i.e., all dimensions must be either 1 or the same as the corresponding losses dimension).
scope: The scope for the operations performed in computing the loss.
loss_collection: The collection to which the loss will be added.
reduction: Type of reduction to apply to the loss.
Returns:

Weighted loss Tensor of the same type as losses. If reduction is NONE, this has the same shape as losses; otherwise, it is a scalar.
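As an illustrative sketch (not the library implementation), the default SUM_BY_NONZERO_WEIGHTS reduction divides the sum of the weighted losses by the number of nonzero weights, while NONE returns the elementwise product. The helper name below is hypothetical:

```python
import numpy as np

def weighted_loss_sketch(losses, weights=1.0, reduction="sum_by_nonzero_weights"):
    """NumPy sketch of compute_weighted_loss reduction semantics."""
    losses = np.asarray(losses, dtype=np.float64)
    weights = np.broadcast_to(np.asarray(weights, dtype=np.float64), losses.shape)
    weighted = losses * weights
    if reduction == "none":
        # Same shape as losses.
        return weighted
    # SUM_BY_NONZERO_WEIGHTS: average over the nonzero weights only.
    num_nonzero = np.count_nonzero(weights)
    return weighted.sum() / max(num_nonzero, 1)

# A zero weight excludes that element from the average:
print(weighted_loss_sketch([1.0, 2.0, 3.0], weights=[1.0, 0.0, 1.0]))  # 2.0
```

Note how a zero weight both removes the element's contribution from the numerator and shrinks the divisor, which is what distinguishes SUM_BY_NONZERO_WEIGHTS from a plain mean.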

Raises:

ValueError: If weights is None or the shape of weights is invalid.
Note:

When calculating the gradient of a weighted loss, contributions from both losses and weights are considered. If your weights depend on some model parameters but you do not want this to affect the loss gradient, you need to apply tf.stop_gradient to weights before passing them to compute_weighted_loss.
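A minimal sketch of that pattern, assuming TensorFlow 2.x with eager execution (the variable names are illustrative):

```python
import tensorflow as tf

losses = tf.constant([1.0, 2.0, 3.0])
# Weights that depend on a trainable parameter.
w = tf.Variable([0.5, 1.0, 1.5])

with tf.GradientTape() as tape:
    # tf.stop_gradient blocks the loss gradient from flowing into w,
    # so only losses contribute to the gradient.
    loss = tf.compat.v1.losses.compute_weighted_loss(
        losses, weights=tf.stop_gradient(w))

grad = tape.gradient(loss, w)
print(grad)  # None: w receives no gradient
```

Without the tf.stop_gradient wrapper, tape.gradient(loss, w) would return a nonzero gradient, since the weighted loss depends on w directly.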

Eager Compatibility

The loss_collection argument is ignored when executing eagerly. Consider holding on to the return value or collecting losses via a tf.keras.Model.