Supporting masking in your custom layers
Sometimes, you may need to write layers that generate a mask (like Embedding), or layers that need to modify the current mask.
For instance, any layer that produces a tensor with a different time dimension than its input, such as a Concatenate layer that concatenates on the time dimension, will need to modify the current mask so that downstream layers will be able to properly take masked timesteps into account.
To do this, your layer should implement the layer.compute_mask() method, which produces a new mask given the input and the current mask.
Here is an example of a TemporalSplit layer that needs to modify the current mask.
class TemporalSplit(keras.layers.Layer):
    """Split the input tensor into 2 tensors along the time dimension."""

    def call(self, inputs):
        # Expect the input to be 3D and the mask to be 2D; split the input tensor
        # into 2 subtensors along the time axis (axis 1).
        return tf.split(inputs, 2, axis=1)

    def compute_mask(self, inputs, mask=None):
        # Also split the mask into 2 if it is present.
        if mask is None:
            return None
        return tf.split(mask, 2, axis=1)

first_half, second_half = TemporalSplit()(masked_embedding)
print(first_half._keras_mask)
print(second_half._keras_mask)
tf.Tensor(
[[ True True True]
[ True True True]
[ True True True]], shape=(3, 3), dtype=bool)
tf.Tensor(
[[False False False]
[ True True False]
[ True True True]], shape=(3, 3), dtype=bool)
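Because TemporalSplit implements compute_mask(), each output half carries its own mask, and downstream mask-consuming layers pick it up automatically. As a minimal sketch (reusing masked_embedding and the layers module from the surrounding examples; the LSTM width is arbitrary):
# Each half carries the mask computed above, so a mask-consuming layer
# such as LSTM skips the padded timesteps of that half.
lstm_on_first = layers.LSTM(8)(first_half)    # uses first_half._keras_mask
lstm_on_second = layers.LSTM(8)(second_half)  # uses second_half._keras_mask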
Here is another example of a CustomEmbedding layer that is capable of generating a mask from input values:
class CustomEmbedding(keras.layers.Layer):
    def __init__(self, input_dim, output_dim, mask_zero=False, **kwargs):
        super(CustomEmbedding, self).__init__(**kwargs)
        self.input_dim = input_dim
        self.output_dim = output_dim
        self.mask_zero = mask_zero

    def build(self, input_shape):
        self.embeddings = self.add_weight(
            shape=(self.input_dim, self.output_dim),
            initializer="random_normal",
            dtype="float32",
        )

    def call(self, inputs):
        return tf.nn.embedding_lookup(self.embeddings, inputs)

    def compute_mask(self, inputs, mask=None):
        if not self.mask_zero:
            return None
        return tf.not_equal(inputs, 0)

layer = CustomEmbedding(10, 32, mask_zero=True)
x = np.random.random((3, 10)) * 9
x = x.astype("int32")
y = layer(x)
mask = layer.compute_mask(x)
print(mask)
tf.Tensor(
[[ True True True True True True True True True True]
[ True True True True True True True True True True]
[ True True True True True True True True True True]], shape=(3, 10), dtype=bool)
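As with the built-in Embedding layer, the mask returned by compute_mask() is picked up automatically when the layer is used inside a model, so you don't need to pass it around by hand. A minimal sketch (the LSTM width here is arbitrary, not taken from the example above):
inputs = keras.Input(shape=(None,), dtype="int32")
embedded = CustomEmbedding(10, 32, mask_zero=True)(inputs)
outputs = layers.LSTM(16)(embedded)  # receives the mask; padded timesteps are skipped
model = keras.Model(inputs, outputs)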
Opting-in to mask propagation on compatible layers
Most layers don't modify the time dimension, so they don't need to modify the current mask. However, they may still want to be able to propagate the current mask, unchanged, to the next layer. This is an opt-in behavior. By default, a custom layer will destroy the current mask (since the framework has no way to tell whether propagating the mask is safe to do).
If you have a custom layer that does not modify the time dimension, and if you want it to be able to propagate the current input mask, you should set self.supports_masking = True in the layer constructor. In this case, the default behavior of compute_mask() is to just pass the current mask through.
Here's an example of a layer that is whitelisted for mask propagation:
class MyActivation(keras.layers.Layer):
    def __init__(self, **kwargs):
        super(MyActivation, self).__init__(**kwargs)
        # Signal that the layer is safe for mask propagation
        self.supports_masking = True

    def call(self, inputs):
        return tf.nn.relu(inputs)

You can now use this custom layer in-between a mask-generating layer (like Embedding) and a mask-consuming layer (like LSTM), and it will pass the mask along so that it reaches the mask-consuming layer.
inputs = keras.Input(shape=(None,), dtype="int32")
x = layers.Embedding(input_dim=5000, output_dim=16, mask_zero=True)(inputs)
x = MyActivation()(x) # Will pass the mask along
print("Mask found:", x._keras_mask)
outputs = layers.LSTM(32)(x) # Will receive the mask
model = keras.Model(inputs, outputs)
Mask found: Tensor("embedding_4/NotEqual:0", shape=(None, None), dtype=bool)
Writing layers that need mask information
Some layers are mask consumers: they accept a mask argument in call and use it to determine whether to skip certain time steps.
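For illustration, here is a sketch of such a mask-consuming layer: a hypothetical MaskedTemporalMean (not part of the Keras API) that averages over the time dimension while ignoring padded timesteps:
class MaskedTemporalMean(keras.layers.Layer):
    """Average over the time axis, ignoring masked (padded) timesteps."""

    def call(self, inputs, mask=None):
        # inputs: (batch, time, features); mask: (batch, time) booleans, or None.
        if mask is None:
            return tf.reduce_mean(inputs, axis=1)
        float_mask = tf.expand_dims(tf.cast(mask, inputs.dtype), -1)
        summed = tf.reduce_sum(inputs * float_mask, axis=1)
        counts = tf.reduce_sum(float_mask, axis=1)
        return summed / tf.maximum(counts, 1.0)

    def compute_mask(self, inputs, mask=None):
        # The time dimension is reduced away, so no mask is passed downstream.
        return None
Because call() accepts a mask argument, Keras passes the current mask to this layer automatically whenever one is available.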