Computes softmax activations.
tf.nn.softmax(
    logits, axis=None, name=None
)
This function performs the equivalent of:
softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis)
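For example, a minimal sketch of a typical call, comparing it with the manual formulation above (the tensor values are illustrative; note that keepdims=True is needed in the manual version so the division broadcasts correctly over a 2-D input):

import tensorflow as tf

# Small batch of logits; softmax is applied over the last axis by default.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [1.0, 3.0, 0.2]])

probs = tf.nn.softmax(logits)

# Equivalent manual computation (keepdims=True keeps the reduced axis so
# the division broadcasts row by row).
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis=-1, keepdims=True)

print(probs.numpy())
print(tf.reduce_sum(probs, axis=-1).numpy())  # each row sums to 1.0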
Args

logits: A non-empty Tensor. Must be one of the following types: half, float32, float64.
axis: The dimension softmax would be performed on. The default is -1, which indicates the last dimension. See the example below.
name: A name for the operation (optional).

Returns

A Tensor. Has the same type and shape as logits.
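As a sketch of the axis argument (values are illustrative), softmax can normalize over a dimension other than the last:

import tensorflow as tf

x = tf.constant([[2.0, 1.0, 0.1],
                 [1.0, 3.0, 0.2]])

cols = tf.nn.softmax(x, axis=0)   # normalize each column (across the first dimension)
rows = tf.nn.softmax(x, axis=-1)  # default: normalize each row (last dimension)

print(tf.reduce_sum(cols, axis=0).numpy())   # columns sum to 1.0
print(tf.reduce_sum(rows, axis=-1).numpy())  # rows sum to 1.0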
Raises

InvalidArgumentError: if logits is empty or axis is beyond the last dimension of logits.
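A minimal sketch of the out-of-range axis case described above, catching the documented error (the exact error message may vary by TensorFlow version):

import tensorflow as tf

x = tf.constant([[2.0, 1.0, 0.1]])

try:
    # axis=5 is beyond the last dimension of this rank-2 tensor.
    tf.nn.softmax(x, axis=5)
except tf.errors.InvalidArgumentError as e:
    print("InvalidArgumentError:", e)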