Returns

Scalar test loss (if the model has a single output and no metrics) or list of scalars (if the model has multiple outputs and/or metrics). The attribute model.metrics_names will give you the display labels for the scalar outputs.

Raises

RuntimeError: If model.test_on_batch is wrapped in tf.function.
ValueError: In case of invalid user-provided arguments.
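As a quick sketch of the return format described above (the model architecture and data here are illustrative, not part of the API):

```python
import numpy as np
import tensorflow as tf

# A minimal single-output model compiled with one metric, so
# test_on_batch returns a list of scalars: [loss, accuracy].
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x = np.random.random((8, 3)).astype("float32")
y = np.random.randint(0, 2, size=(8,))

results = model.test_on_batch(x, y)
print(model.metrics_names)  # e.g. ['loss', 'accuracy']
print(results)              # list of scalars, one per name above
```

With no compiled metrics, the same call would return a single scalar loss instead of a list.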
predict_on_batch method

Model.predict_on_batch(x)

Returns predictions for a single batch of samples.

Arguments

x: Input data. It could be:
A Numpy array (or array-like), or a list of arrays (in case the model has multiple inputs).
A TensorFlow tensor, or a list of tensors (in case the model has multiple inputs).

Returns

Numpy array(s) of predictions.

Raises

RuntimeError: If model.predict_on_batch is wrapped in tf.function.
ValueError: In case of mismatch between given number of inputs and expectations of the model.
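A minimal sketch of predict_on_batch (the model and batch shape are illustrative): unlike predict, it runs the whole batch in a single forward pass with no mini-batch splitting.

```python
import numpy as np
import tensorflow as tf

# Toy model: 3 input features, 5 output classes.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, input_shape=(3,)),
    tf.keras.layers.Softmax(),
])

batch = np.random.random((10, 3)).astype("float32")

# One forward pass over the entire batch; returns NumPy array(s).
preds = model.predict_on_batch(batch)
print(preds.shape)  # (10, 5): one prediction row per sample
```

For a model with multiple inputs, x would instead be a list of arrays or tensors, one per input.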
run_eagerly property

tf.keras.Model.run_eagerly

Settable attribute indicating whether the model should run eagerly.

Running eagerly means that your model will be run step by step, like Python code. Your model might run slower, but it should become easier for you to debug it by stepping into individual layer calls.

By default, we will attempt to compile your model to a static graph to deliver the best execution performance.

Returns

Boolean, whether the model should run eagerly.
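A short sketch of toggling this attribute (the model itself is illustrative): run_eagerly can be passed to compile() or set later, e.g. to step into layer calls with a debugger before switching back to the compiled-graph default.

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(2,))])

# Request eager, step-by-step execution of train/test/predict steps.
model.compile(optimizer="sgd", loss="mse", run_eagerly=True)
print(model.run_eagerly)  # True

# Settable attribute: switch back to the static-graph fast path.
model.run_eagerly = False
print(model.run_eagerly)  # False
```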
Model saving & serialization APIs

save method

Model.save(
    filepath,
    overwrite=True,
    include_optimizer=True,
    save_format=None,
    signatures=None,
    options=None,
    save_traces=True,
)

Saves the model to a TensorFlow SavedModel or a single HDF5 file.

Please see tf.keras.models.save_model or the Serialization and Saving guide for details.

Arguments

filepath: String or PathLike, path to the SavedModel or H5 file to save the model.
overwrite: Whether to silently overwrite any existing file at the target location, or provide the user with a manual prompt.
include_optimizer: If True, save the optimizer's state together.
save_format: Either 'tf' or 'h5', indicating whether to save the model to TensorFlow SavedModel or HDF5. Defaults to 'tf' in TF 2.X, and 'h5' in TF 1.X.
signatures: Signatures to save with the SavedModel. Applicable to the 'tf' format only. Please see the signatures argument in tf.saved_model.save for details.
options: (only applies to SavedModel format) tf.saved_model.SaveOptions object that specifies options for saving to SavedModel.
save_traces: (only applies to SavedModel format) When enabled, the SavedModel will store the function traces for each layer. This can be disabled, so that only the configs of each layer are stored. Defaults to True. Disabling this will decrease serialization time and reduce file size, but it requires that all custom layers/models implement a get_config() method.
Example

from tensorflow.keras.models import load_model

model.save('my_model.h5')  # creates an HDF5 file 'my_model.h5'
del model  # deletes the existing model

# Returns a compiled model identical to the previous one
model = load_model('my_model.h5')
save_model function

tf.keras.models.save_model(
    model,
    filepath,
    overwrite=True,
    include_optimizer=True,
    save_format=None,
    signatures=None,
    options=None,
    save_traces=True,
)

Saves a model as a TensorFlow SavedModel or HDF5 file.

See the Serialization and Saving guide for details.
Usage:

>>> model = tf.keras.Sequential([
...     tf.keras.layers.Dense(5, input_shape=(3,)),
...     tf.keras.layers.Softmax()])
>>> model.save('/tmp/model')
>>> loaded_model = tf.keras.models.load_model('/tmp/model')
>>> x = tf.random.uniform((10, 3))
>>> assert np.allclose(model.predict(x), loaded_model.predict(x))
The SavedModel and HDF5 files contain:

the model's configuration (topology)