tf.keras.activations.elu(
x,
alpha=1.0
)
Defined in tensorflow/python/keras/activations.py.
Exponential linear unit.
Arguments:
x: Input tensor.
alpha: A scalar, slope of negative section.
Returns:
The exponential linear activation: `x` if `x > 0` and
`alpha * (exp(x)-1)` if `x < 0`.
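A minimal usage sketch of the behavior described above: positive inputs pass through unchanged, while negative inputs are mapped to `alpha * (exp(x) - 1)` and saturate near `-alpha`.

```python
import numpy as np
import tensorflow as tf

# ELU is the identity for x > 0; for x < 0 it returns
# alpha * (exp(x) - 1), which approaches -alpha as x -> -inf.
x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
y = tf.keras.activations.elu(x, alpha=1.0)

print(y.numpy())
# Positive entries are unchanged; negative entries lie in (-1, 0).
assert np.allclose(y.numpy()[3:], [1.0, 3.0])
```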
Reference:
- Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)