tf.keras.activations.elu


Exponential linear unit.

tf.keras.activations.elu(
    x, alpha=1.0
)

Arguments:

    x: Input tensor or variable.
    alpha: A scalar, slope of the negative section.

Returns:

The exponential linear activation: x if x > 0 and alpha * (exp(x)-1) if x < 0.
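The piecewise formula above can be sketched in plain NumPy (a hedged equivalent for illustration, not the TensorFlow implementation):

```python
import numpy as np

def elu(x, alpha=1.0):
    # x if x > 0, alpha * (exp(x) - 1) otherwise
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * np.expm1(x))

# Positive inputs pass through unchanged; negative inputs
# saturate smoothly toward -alpha.
print(elu([-3.0, -1.0, 0.0, 2.0]))
```

Note that unlike ReLU, the output for negative inputs is nonzero, which keeps gradients flowing and pushes mean activations closer to zero.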

Reference:

    Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
    (Clevert et al., 2016): https://arxiv.org/abs/1511.07289