tf.keras.activations.hard_sigmoid


Hard sigmoid activation function.

tf.keras.activations.hard_sigmoid(
    x
)

A piecewise-linear approximation of the sigmoid function, faster to compute than the exact sigmoid activation.

For example:

>>> a = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
>>> b = tf.keras.activations.hard_sigmoid(a)
>>> b.numpy()
array([0. , 0.3, 0.5, 0.7, 1. ], dtype=float32)

Arguments:

x: Input tensor.

Returns:

The hard sigmoid activation: 0 if x < -2.5, 1 if x > 2.5, 0.2 * x + 0.5 if -2.5 <= x <= 2.5.
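The piecewise definition above can be sketched as a plain NumPy function (a minimal illustration, not part of the TensorFlow API; the function name here is arbitrary):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear approximation of sigmoid:
    # 0 for x < -2.5, 1 for x > 2.5, 0.2 * x + 0.5 in between.
    # np.clip applies the two saturation branches in one call.
    x = np.asarray(x, dtype=np.float32)
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

print(hard_sigmoid([-3.0, -1.0, 0.0, 1.0, 3.0]))
```

The saturation thresholds at ±2.5 fall out of solving 0.2 * x + 0.5 = 0 and = 1, which is why the clip to [0, 1] reproduces all three branches.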