Rectified Linear Unit activation function.
Inherits From: Layer
tf.keras.layers.ReLU(
max_value=None, negative_slope=0, threshold=0, **kwargs
)
With default values, it returns element-wise max(x, 0).
Otherwise, it follows:

f(x) = max_value                          for x >= max_value
f(x) = x                                  for threshold <= x < max_value
f(x) = negative_slope * (x - threshold)   otherwise
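
A minimal usage sketch (input values chosen for illustration) showing the default element-wise behavior and the effect of max_value:

import tensorflow as tf

# Default ReLU: element-wise max(x, 0).
layer = tf.keras.layers.ReLU()
print(layer([-3.0, -1.0, 0.0, 2.0]).numpy())   # expected: [0. 0. 0. 2.]

# With max_value set, activations are capped at that value.
capped = tf.keras.layers.ReLU(max_value=1.0)
print(capped([-3.0, -1.0, 0.0, 2.0]).numpy())  # expected: [0. 0. 0. 1.]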
Input shape:
Arbitrary. Use the keyword argument input_shape
(tuple of integers, does not include the samples axis)
when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
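
A brief sketch (model and shapes are illustrative, not part of this page) of passing input_shape when ReLU is the first layer in a Sequential model, and of the output shape matching the input shape:

import tensorflow as tf

# Hypothetical model: ReLU as the first layer, so input_shape is given
# (a tuple of integers, excluding the samples axis).
model = tf.keras.Sequential([
    tf.keras.layers.ReLU(input_shape=(4,)),
    tf.keras.layers.Dense(2),
])

# The ReLU layer's output shape is the same as its input shape.
print(model.layers[0].compute_output_shape((None, 4)))  # expected: (None, 4)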
Arguments:
max_value: Float >= 0. Maximum activation value. Defaults to None (unlimited).
negative_slope: Float >= 0. Negative slope coefficient. Defaults to 0.
threshold: Float. Threshold value for thresholded activation. Defaults to 0.
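
An illustrative sketch (values chosen for demonstration) of how negative_slope and threshold change the activation, following the piecewise definition above:

import tensorflow as tf

# negative_slope gives "leaky" behavior: inputs below the threshold (0 here)
# are scaled by negative_slope instead of being zeroed.
leaky = tf.keras.layers.ReLU(negative_slope=0.5)
print(leaky([-10.0, -1.0, 0.0, 2.0]).numpy())        # expected: [-5.  -0.5  0.   2. ]

# threshold shifts the activation point: inputs below 1.5 follow the
# negative_slope branch (0 here, so they become 0).
thresholded = tf.keras.layers.ReLU(threshold=1.5)
print(thresholded([-3.0, -1.0, 0.0, 2.0]).numpy())   # expected: [0. 0. 0. 2.]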