>>> # Using 'sum' reduction type.
>>> h = tf.keras.losses.CategoricalHinge(
... reduction=tf.keras.losses.Reduction.SUM)
>>> h(y_true, y_pred).numpy()
2.8
>>> # Using 'none' reduction type.
>>> h = tf.keras.losses.CategoricalHinge(
... reduction=tf.keras.losses.Reduction.NONE)
>>> h(y_true, y_pred).numpy()
array([1.2, 1.6], dtype=float32)
Usage with the compile() API:
model.compile(optimizer='sgd', loss=tf.keras.losses.CategoricalHinge())
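A fuller sketch of the same pattern (the model shape and training data below are illustrative assumptions, not part of the API reference):

# Hypothetical model and data, for illustration only.
model = tf.keras.Sequential([tf.keras.layers.Dense(3)])
model.compile(optimizer='sgd', loss=tf.keras.losses.CategoricalHinge())
x = np.random.random(size=(2, 4))
y = tf.keras.utils.to_categorical([0, 2], num_classes=3)
model.fit(x, y, epochs=1, verbose=0)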
hinge function
tf.keras.losses.hinge(y_true, y_pred)
Computes the hinge loss between y_true and y_pred.
loss = mean(maximum(1 - y_true * y_pred, 0), axis=-1)
Standalone usage:
>>> y_true = np.random.choice([-1, 1], size=(2, 3))
>>> y_pred = np.random.random(size=(2, 3))
>>> loss = tf.keras.losses.hinge(y_true, y_pred)
>>> assert loss.shape == (2,)
>>> assert np.array_equal(
... loss.numpy(),
... np.mean(np.maximum(1. - y_true * y_pred, 0.), axis=-1))
Arguments
y_true: The ground truth values. y_true values are expected to be -1 or 1. If binary (0 or 1) labels are provided they will be converted to -1 or 1. shape = [batch_size, d0, .. dN].
y_pred: The predicted values. shape = [batch_size, d0, .. dN].
Returns
Hinge loss values. shape = [batch_size, d0, .. dN-1].
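As a hand-picked illustration (values chosen here, not taken from the reference), binary 0/1 labels are first mapped to -1/1, so y_true = [[0., 1.]] behaves like [[-1., 1.]]:

>>> # Illustrative toy values; 0/1 labels are converted to -1/1
>>> # before the hinge formula is applied.
>>> y_true = np.array([[0., 1.]], dtype=np.float32)
>>> y_pred = np.array([[0.5, 0.5]], dtype=np.float32)
>>> tf.keras.losses.hinge(y_true, y_pred).numpy()
array([1.], dtype=float32)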
squared_hinge function
tf.keras.losses.squared_hinge(y_true, y_pred)
Computes the squared hinge loss between y_true and y_pred.
loss = mean(square(maximum(1 - y_true * y_pred, 0)), axis=-1)
Standalone usage:
>>> y_true = np.random.choice([-1, 1], size=(2, 3))
>>> y_pred = np.random.random(size=(2, 3))
>>> loss = tf.keras.losses.squared_hinge(y_true, y_pred)
>>> assert loss.shape == (2,)
>>> assert np.array_equal(
... loss.numpy(),
... np.mean(np.square(np.maximum(1. - y_true * y_pred, 0.)), axis=-1))
Arguments
y_true: The ground truth values. y_true values are expected to be -1 or 1. If binary (0 or 1) labels are provided they will be converted to -1 or 1. shape = [batch_size, d0, .. dN].
y_pred: The predicted values. shape = [batch_size, d0, .. dN].
Returns
Squared hinge loss values. shape = [batch_size, d0, .. dN-1].
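As a hand-picked comparison (values chosen here, not taken from the reference), squaring the margin term penalizes large violations more heavily than plain hinge:

>>> # Illustrative toy values, run through both losses for contrast:
>>> # margins are [1.5, 0.5], so hinge averages to 1.0 while
>>> # squared hinge averages [2.25, 0.25] to 1.25.
>>> y_true = np.array([[-1., 1.]], dtype=np.float32)
>>> y_pred = np.array([[0.5, 0.5]], dtype=np.float32)
>>> tf.keras.losses.hinge(y_true, y_pred).numpy()
array([1.], dtype=float32)
>>> tf.keras.losses.squared_hinge(y_true, y_pred).numpy()
array([1.25], dtype=float32)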
categorical_hinge function
tf.keras.losses.categorical_hinge(y_true, y_pred)
Computes the categorical hinge loss between y_true and y_pred.
loss = maximum(neg - pos + 1, 0) where neg = maximum((1 - y_true) * y_pred, axis=-1) and pos = sum(y_true * y_pred, axis=-1)
Standalone usage:
>>> y_true = np.random.randint(0, 3, size=(2,))
>>> y_true = tf.keras.utils.to_categorical(y_true, num_classes=3)
>>> y_pred = np.random.random(size=(2, 3))
>>> loss = tf.keras.losses.categorical_hinge(y_true, y_pred)
>>> assert loss.shape == (2,)
>>> pos = np.sum(y_true * y_pred, axis=-1)
>>> neg = np.amax((1. - y_true) * y_pred, axis=-1)
>>> assert np.array_equal(loss.numpy(), np.maximum(0., neg - pos + 1.))
Arguments
y_true: The ground truth values. y_true values are expected to be either {-1, +1} or {0, 1} (i.e. a one-hot-encoded tensor).
y_pred: The predicted values.
Returns
Categorical hinge loss values.
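A deterministic walk-through of the pos/neg terms (values chosen here, not taken from the reference):

>>> # Illustrative one-hot example: pos = 0.6, neg = max(0.3, 0.1) = 0.3,
>>> # so loss = max(0.3 - 0.6 + 1, 0) = 0.7.
>>> y_true = np.array([[0., 1., 0.]], dtype=np.float32)
>>> y_pred = np.array([[0.3, 0.6, 0.1]], dtype=np.float32)
>>> tf.keras.losses.categorical_hinge(y_true, y_pred).numpy()
array([0.7], dtype=float32)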
Probabilistic losses
BinaryCrossentropy class
tf.keras.losses.BinaryCrossentropy(
from_logits=False, label_smoothing=0, reduction="auto", name="binary_crossentropy"
)
Computes the cross-entropy loss between true labels and predicted labels.
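A brief sketch of the from_logits flag (values chosen here for illustration; output rounded to two decimals): with from_logits=True, y_pred holds raw scores and the sigmoid is applied inside the loss:

>>> # Illustrative logits, not probabilities; from_logits=True tells the
>>> # loss to apply the sigmoid itself before computing cross-entropy.
>>> y_true = [[0., 1.], [1., 1.]]
>>> y_pred = [[2.0, -1.0], [3.0, 0.5]]
>>> bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
>>> bce(y_true, y_pred).numpy()
0.99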