headers: Dictionary; optional custom HTTP headers.
send_as_json: Boolean; whether the request should be sent as "application/json".
Base Callback class
Callback class
tf.keras.callbacks.Callback()
Abstract base class used to build new callbacks.
Callbacks can be passed to keras methods such as fit, evaluate, and predict in order to hook into the various stages of the model training and inference lifecycle.
To create a custom callback, subclass keras.callbacks.Callback and override the method associated with the stage of interest. See https://www.tensorflow.org/guide/keras/custom_callback for more information.
Example
>>> training_finished = False
>>> class MyCallback(tf.keras.callbacks.Callback):
...   def on_train_end(self, logs=None):
...     global training_finished
...     training_finished = True
>>> model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
>>> model.compile(loss='mean_squared_error')
>>> model.fit(tf.constant([[1.0]]), tf.constant([[1.0]]),
...           callbacks=[MyCallback()])
>>> assert training_finished == True
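Under the hood, fit simply invokes the registered hooks at the matching stage of the loop. A simplified pure-Python sketch of that dispatch (an illustration only, not the real Keras implementation; the fake loss values are made up):

```python
class Callback:
    """Minimal stand-in for keras.callbacks.Callback (illustration only)."""
    def on_train_begin(self, logs=None): pass
    def on_epoch_end(self, epoch, logs=None): pass
    def on_train_end(self, logs=None): pass

def fit(callbacks, epochs=2):
    """Simplified training loop: call every hook at the matching stage."""
    for cb in callbacks:
        cb.on_train_begin()
    for epoch in range(epochs):
        logs = {"loss": 1.0 / (epoch + 1)}  # fake metric for illustration
        for cb in callbacks:
            cb.on_epoch_end(epoch, logs)
    for cb in callbacks:
        cb.on_train_end()

class History(Callback):
    """Records the loss reported at the end of every epoch."""
    def __init__(self):
        self.losses = []
    def on_epoch_end(self, epoch, logs=None):
        self.losses.append(logs["loss"])

history = History()
fit([history], epochs=2)
print(history.losses)  # [1.0, 0.5]
```

This is why overriding a single method is enough: any hook you do not override falls through to the no-op base implementation.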
Attributes
params: Dict. Training parameters (e.g. verbosity, batch size, number of epochs...).
model: Instance of keras.models.Model. Reference of the model being trained.
The logs dictionary that callback methods take as argument will contain keys for quantities relevant to the current batch or epoch (see method-specific docstrings).
Regression losses
MeanSquaredError class
tf.keras.losses.MeanSquaredError(reduction="auto", name="mean_squared_error")
Computes the mean of squares of errors between labels and predictions.
loss = square(y_true - y_pred)
Standalone usage:
>>> y_true = [[0., 1.], [0., 0.]]
>>> y_pred = [[1., 1.], [1., 0.]]
>>> # Using 'auto'/'sum_over_batch_size' reduction type.
>>> mse = tf.keras.losses.MeanSquaredError()
>>> mse(y_true, y_pred).numpy()
0.5
>>> # Calling with 'sample_weight'.
>>> mse(y_true, y_pred, sample_weight=[0.7, 0.3]).numpy()
0.25
>>> # Using 'sum' reduction type.
>>> mse = tf.keras.losses.MeanSquaredError(
...     reduction=tf.keras.losses.Reduction.SUM)
>>> mse(y_true, y_pred).numpy()
1.0
>>> # Using 'none' reduction type.
>>> mse = tf.keras.losses.MeanSquaredError(
...     reduction=tf.keras.losses.Reduction.NONE)
>>> mse(y_true, y_pred).numpy()
array([0.5, 0.5], dtype=float32)
Usage with the compile() API:
model.compile(optimizer='sgd', loss=tf.keras.losses.MeanSquaredError())
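The reduction values above can be checked by hand. A NumPy sketch of the same arithmetic, assuming (as in Keras) that the per-sample loss is the mean of squared errors over the last axis, and that sample weights are applied before the sum_over_batch_size reduction:

```python
import numpy as np

y_true = np.array([[0., 1.], [0., 0.]])
y_pred = np.array([[1., 1.], [1., 0.]])

# Per-sample loss: mean of squared errors over the last axis.
per_sample = np.mean(np.square(y_true - y_pred), axis=-1)

# 'auto' / 'sum_over_batch_size': mean over the batch.
print(per_sample.mean())  # 0.5

# With sample_weight: weighted losses summed, then divided by batch size.
w = np.array([0.7, 0.3])
print((per_sample * w).sum() / len(per_sample))  # matches the 0.25 above (up to float rounding)

# 'sum': plain sum over the batch.
print(per_sample.sum())  # 1.0

# 'none': the per-sample vector itself.
print(per_sample)  # [0.5 0.5]
```

Note that with sample_weight=[0.7, 0.3] the weights happen to sum to 1, so the weighted sum (0.35 + 0.15 = 0.5) divided by the batch size of 2 gives 0.25.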
MeanAbsoluteError class
tf.keras.losses.MeanAbsoluteError(reduction="auto", name="mean_absolute_error")
Computes the mean of the absolute difference between labels and predictions.
loss = abs(y_true - y_pred)
Standalone usage:
>>> y_true = [[0., 1.], [0., 0.]]
>>> y_pred = [[1., 1.], [1., 0.]]
>>> # Using 'auto'/'sum_over_batch_size' reduction type.
>>> mae = tf.keras.losses.MeanAbsoluteError()
>>> mae(y_true, y_pred).numpy()
0.5
>>> # Calling with 'sample_weight'.
>>> mae(y_true, y_pred, sample_weight=[0.7, 0.3]).numpy()
0.25
>>> # Using 'sum' reduction type.
>>> mae = tf.keras.losses.MeanAbsoluteError(
...     reduction=tf.keras.losses.Reduction.SUM)
>>> mae(y_true, y_pred).numpy()
1.0
>>> # Using 'none' reduction type.
>>> mae = tf.keras.losses.MeanAbsoluteError(
...     reduction=tf.keras.losses.Reduction.NONE)
>>> mae(y_true, y_pred).numpy()
array([0.5, 0.5], dtype=float32)
Usage with the compile() API:
model.compile(optimizer='sgd', loss=tf.keras.losses.MeanAbsoluteError())