Thresholded Rectified Linear Unit.
Inherits From: Layer
tf.keras.layers.ThresholdedReLU(
    theta=1.0, **kwargs
)
It follows:
f(x) = x for x > theta,
f(x) = 0 otherwise.
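A minimal sketch of this behavior on a few illustrative values (the sample tensor below is not from the original docs). Note the inequality is strict, so an element exactly equal to theta is zeroed:

import tensorflow as tf

# Elements not strictly greater than theta are set to 0;
# elements above theta pass through unchanged.
layer = tf.keras.layers.ThresholdedReLU(theta=1.0)
x = tf.constant([-2.0, 0.5, 1.0, 1.5, 3.0])
print(layer(x).numpy())  # [0.  0.  0.  1.5 3. ]  (1.0 is not > theta, so it maps to 0)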
Input shape:
Arbitrary. Use the keyword argument input_shape
(tuple of integers, does not include the samples axis)
when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
Arguments:
theta: Float >= 0. Threshold location of activation.
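A minimal usage sketch, assuming a toy Sequential model (the layer sizes and theta value here are illustrative): input_shape excludes the samples axis, and the layer's output shape matches its input shape.

import tensorflow as tf

# Illustrative model: input_shape=(4,) omits the batch (samples) axis.
model = tf.keras.Sequential([
    tf.keras.layers.ThresholdedReLU(theta=0.5, input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.summary()  # ThresholdedReLU reports the same shape for input and output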