Defined in tensorflow/_api/v1/keras/activations/__init__.py
Built-in activation functions.
Functions
elu(...): Exponential linear unit.
hard_sigmoid(...): Hard sigmoid activation function.
relu(...): Rectified Linear Unit.
selu(...): Scaled Exponential Linear Unit (SELU).
softmax(...): Softmax activation function.
softplus(...): Softplus activation function.
softsign(...): Softsign activation function.