...Predicting: start of batch 2; got log keys: []
...Predicting: end of batch 2; got log keys: ['outputs']
...Predicting: start of batch 3; got log keys: []
...Predicting: end of batch 3; got log keys: ['outputs']
...Predicting: start of batch 4; got log keys: []
...Predicting: end of batch 4; got log keys: ['outputs']
...Predicting: start of batch 5; got log keys: []
...Predicting: end of batch 5; got log keys: ['outputs']
...Predicting: start of batch 6; got log keys: []
...Predicting: end of batch 6; got log keys: ['outputs']
...Predicting: start of batch 7; got log keys: []
...Predicting: end of batch 7; got log keys: ['outputs']
Stop predicting; got log keys: []
Usage of logs dict
The logs dict contains the loss value and all of the metrics at the end of a batch or epoch. The example below prints the loss and, at the end of each epoch, the mean absolute error.
class LossAndErrorPrintingCallback(keras.callbacks.Callback):
    def on_train_batch_end(self, batch, logs=None):
        print("For batch {}, loss is {:7.2f}.".format(batch, logs["loss"]))

    def on_test_batch_end(self, batch, logs=None):
        print("For batch {}, loss is {:7.2f}.".format(batch, logs["loss"]))

    def on_epoch_end(self, epoch, logs=None):
        print(
            "The average loss for epoch {} is {:7.2f} "
            "and mean absolute error is {:7.2f}.".format(
                epoch, logs["loss"], logs["mean_absolute_error"]
            )
        )
model = get_model()
model.fit(
    x_train,
    y_train,
    batch_size=128,
    epochs=2,
    verbose=0,
    callbacks=[LossAndErrorPrintingCallback()],
)

res = model.evaluate(
    x_test,
    y_test,
    batch_size=128,
    verbose=0,
    callbacks=[LossAndErrorPrintingCallback()],
)
For batch 0, loss is   32.45.
For batch 1, loss is  393.79.
For batch 2, loss is  272.00.
For batch 3, loss is  206.95.
For batch 4, loss is  167.29.
For batch 5, loss is  140.41.
For batch 6, loss is  121.19.
For batch 7, loss is  109.21.
The average loss for epoch 0 is  109.21 and mean absolute error is    5.83.
For batch 0, loss is    5.94.
For batch 1, loss is    5.73.
For batch 2, loss is    5.50.
For batch 3, loss is    5.38.
For batch 4, loss is    5.16.
For batch 5, loss is    5.19.
For batch 6, loss is    5.64.
For batch 7, loss is    7.05.
The average loss for epoch 1 is    7.05 and mean absolute error is    2.14.
For batch 0, loss is   40.89.
For batch 1, loss is   42.12.
For batch 2, loss is   41.42.
For batch 3, loss is   42.10.
For batch 4, loss is   42.05.
For batch 5, loss is   42.91.
For batch 6, loss is   43.05.
For batch 7, loss is   42.94.
Usage of self.model attribute
In addition to receiving log information when one of their methods is called, callbacks have access to the model associated with the current round of training/evaluation/inference: self.model.
Here are a few of the things you can do with self.model in a callback:
- Set self.model.stop_training = True to immediately interrupt training.
- Mutate hyperparameters of the optimizer (available as self.model.optimizer), such as self.model.optimizer.learning_rate.
- Save the model at periodic intervals.
- Record the output of model.predict() on a few test samples at the end of each epoch, to use as a sanity check during training.
- Extract visualizations of intermediate features at the end of each epoch, to monitor what the model is learning over time.
- etc.
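For instance, mutating the optimizer's learning rate from a callback can be sketched as follows; the tiny model and random data here are hypothetical, present only to drive the callback:

```python
import numpy as np
import keras


class HalveLROnEpochEnd(keras.callbacks.Callback):
    """Hypothetical callback: halve the optimizer's learning rate after each epoch."""

    def on_epoch_end(self, epoch, logs=None):
        # Read the current learning rate, then assign the halved value back.
        lr = float(np.array(self.model.optimizer.learning_rate))
        self.model.optimizer.learning_rate = lr * 0.5
        print("Epoch {}: learning rate set to {:.4f}".format(epoch, lr * 0.5))


# Tiny hypothetical regression setup, just to exercise the callback.
model = keras.Sequential([keras.layers.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.1), loss="mse")
x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, batch_size=32, epochs=2, verbose=0, callbacks=[HalveLROnEpochEnd()])
```

After two epochs the learning rate has been halved twice, from 0.1 down to 0.025.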
Let's see this in action in a couple of examples.
Examples of Keras callback applications
Early stopping at minimum loss
This first example shows the creation of a Callback that stops training when the loss reaches its minimum, by setting the attribute self.model.stop_training (boolean). Optionally, you can provide an argument patience to specify how many epochs to wait before stopping after a local minimum has been reached.
tf.keras.callbacks.EarlyStopping provides a more complete and general implementation.
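As a point of comparison, the built-in callback can be dropped straight into fit(); a minimal sketch, where the model and data are hypothetical and the zero learning rate simply keeps the loss flat so that early stopping triggers:

```python
import numpy as np
import keras

# Hypothetical model and data; with learning_rate=0.0 the weights never
# change, so the monitored loss never improves and training stops early.
model = keras.Sequential([keras.layers.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.0), loss="mse")
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

# Stop if "loss" has not improved for 2 consecutive epochs.
early_stop = keras.callbacks.EarlyStopping(monitor="loss", patience=2)
history = model.fit(
    x, y, batch_size=32, epochs=20, verbose=0, callbacks=[early_stop]
)
print("Ran {} of 20 epochs".format(len(history.history["loss"])))
```

Because the loss plateaus immediately, far fewer than the requested 20 epochs actually run.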
import numpy as np


class EarlyStoppingAtMinLoss(keras.callbacks.Callback):
    """Stop training when the loss is at its min, i.e. the loss stops decreasing.