Epoch 2/2
782/782 [==============================] - 1s 978us/step - loss: 0.1679 - sparse_categorical_accuracy: 0.9511 - val_loss: 0.1637 - val_sparse_categorical_accuracy: 0.9529
The returned history object holds a record of the loss values and metric values during training:
history.history
{'loss': [0.3402276635169983, 0.15610544383525848],
 'sparse_categorical_accuracy': [0.9048200249671936, 0.9537400007247925],
 'val_loss': [0.1809607595205307, 0.16366209089756012],
 'val_sparse_categorical_accuracy': [0.9474999904632568, 0.9528999924659729]}
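Since history.history is a plain dictionary keyed by metric name, you can inspect or plot it directly. As a minimal sketch, plotting the training and validation loss from the history object returned by fit() above (this assumes matplotlib is installed; it is not a Keras dependency):

import matplotlib.pyplot as plt

# Plot training vs. validation loss per epoch
plt.plot(history.history["loss"], label="train loss")
plt.plot(history.history["val_loss"], label="val loss")
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.legend()
plt.show()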
We evaluate the model on the test data via evaluate(): |
# Evaluate the model on the test data using `evaluate` |
print("Evaluate on test data") |
results = model.evaluate(x_test, y_test, batch_size=128) |
print("test loss, test acc:", results) |
# Generate predictions (probabilities -- the output of the last layer) |
# on new data using `predict` |
print("Generate predictions for 3 samples") |
predictions = model.predict(x_test[:3]) |
print("predictions shape:", predictions.shape) |
Evaluate on test data |
79/79 [==============================] - 0s 846us/step - loss: 0.1587 - sparse_categorical_accuracy: 0.9513 |
test loss, test acc: [0.15874555706977844, 0.9513000249862671] |
Generate predictions for 3 samples |
predictions shape: (3, 10) |
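Each row of predictions is a probability distribution over the 10 classes, since the last layer is a softmax. To turn these probabilities into hard class labels, a quick sketch using NumPy:

import numpy as np

# Index of the highest-probability class for each sample
predicted_labels = np.argmax(predictions, axis=-1)
print("predicted labels:", predicted_labels)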
Now, let's review each piece of this workflow in detail. |
The compile() method: specifying a loss, metrics, and an optimizer |
To train a model with fit(), you need to specify a loss function, an optimizer, and optionally, some metrics to monitor. |
You pass these to the model as arguments to the compile() method: |
model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
    loss=keras.losses.SparseCategoricalCrossentropy(),
    metrics=[keras.metrics.SparseCategoricalAccuracy()],
)
The metrics argument should be a list -- your model can have any number of metrics. |
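For instance, a sketch that tracks two metrics at once (the top-5 accuracy metric here is an illustrative addition, not part of the example above):

model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
    loss=keras.losses.SparseCategoricalCrossentropy(),
    metrics=[
        keras.metrics.SparseCategoricalAccuracy(),
        keras.metrics.SparseTopKCategoricalAccuracy(k=5),
    ],
)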
If your model has multiple outputs, you can specify different losses and metrics for each output, and you can modulate the contribution of each output to the total loss of the model. You will find more details about this in the Passing data to multi-input, multi-output models section. |
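As a brief sketch of what this looks like (the output names "priority" and "department" are hypothetical; a real model would need to define outputs with these names):

model.compile(
    optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
    loss={
        "priority": keras.losses.BinaryCrossentropy(),
        "department": keras.losses.CategoricalCrossentropy(),
    },
    # Weight the "priority" loss twice as heavily as the "department" loss
    loss_weights={"priority": 2.0, "department": 1.0},
)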
Note that if you're satisfied with the default settings, in many cases the optimizer, loss, and metrics can be specified via string identifiers as a shortcut: |
model.compile(
    optimizer="rmsprop",
    loss="sparse_categorical_crossentropy",
    metrics=["sparse_categorical_accuracy"],
)
For later reuse, let's put our model definition and compile step in functions; we will call them several times across different examples in this guide. |
def get_uncompiled_model():
    inputs = keras.Input(shape=(784,), name="digits")
    x = layers.Dense(64, activation="relu", name="dense_1")(inputs)
    x = layers.Dense(64, activation="relu", name="dense_2")(x)
    outputs = layers.Dense(10, activation="softmax", name="predictions")(x)
    model = keras.Model(inputs=inputs, outputs=outputs)
    return model

def get_compiled_model():
    model = get_uncompiled_model()
    model.compile(
        optimizer="rmsprop",
        loss="sparse_categorical_crossentropy",
        metrics=["sparse_categorical_accuracy"],
    )
    return model
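Later examples can then build and train a fresh model in two lines; a short illustrative sketch (reusing x_train and y_train from earlier in the guide):

model = get_compiled_model()
model.fit(x_train, y_train, batch_size=64, epochs=1)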
Many built-in optimizers, losses, and metrics are available |
In general, you won't have to create your own losses, metrics, or optimizers from scratch, because what you need is likely to be already part of the Keras API (see the configuration sketch after these lists):
Optimizers: |
SGD() (with or without momentum) |
RMSprop() |
Adam() |
etc. |
Losses: |
MeanSquaredError() |
KLDivergence() |
CosineSimilarity() |
etc. |
Metrics: |
AUC() |
Precision() |
Recall() |
etc. |
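As a minimal sketch of configuring a couple of these built-ins with explicit hyperparameters (the values shown are illustrative, not recommendations):

model = get_uncompiled_model()
model.compile(
    # SGD with momentum, as mentioned in the list above
    optimizer=keras.optimizers.SGD(learning_rate=1e-2, momentum=0.9),
    loss=keras.losses.SparseCategoricalCrossentropy(),
    metrics=[keras.metrics.SparseCategoricalAccuracy()],
)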
Custom losses |
If you need to create a custom loss, Keras provides two ways to do so. |
The first method involves creating a function that accepts inputs y_true and y_pred. The following example shows a loss function that computes the mean squared error between the real data and the predictions: |
import tensorflow as tf

def custom_mean_squared_error(y_true, y_pred):
    # Mean of the squared differences between targets and predictions
    return tf.math.reduce_mean(tf.square(y_true - y_pred))

model = get_uncompiled_model()
model.compile(optimizer=keras.optimizers.Adam(), loss=custom_mean_squared_error)
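Note that because this loss subtracts y_pred from y_true directly, the targets must match the shape of the model's softmax output. A sketch that one-hot encodes the integer MNIST labels before fitting (x_train and y_train as earlier in the guide):

# One-hot encode targets so they match the (batch, 10) model output
y_train_one_hot = tf.one_hot(y_train, depth=10)
model.fit(x_train, y_train_one_hot, batch_size=64, epochs=1)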