tf.nn.softmax

Computes softmax activations.

tf.nn.softmax(
    logits, axis=None, name=None
)

This function performs the equivalent of

softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis, keepdims=True)
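
As a rough NumPy sketch of the same computation (the max-subtraction is a standard numerical-stability trick, not part of the formula above, and this helper is an illustration rather than the library implementation):

```python
import numpy as np

def softmax(logits, axis=-1):
    # Subtract the per-axis max for numerical stability; softmax is
    # invariant to adding a constant, so the result is unchanged.
    shifted = logits - np.max(logits, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    # keepdims=True keeps the reduced axis so the division broadcasts
    # correctly for any choice of axis, matching the formula above.
    return exp / np.sum(exp, axis=axis, keepdims=True)

probs = softmax(np.array([1.0, 2.0, 3.0]))
```

Each slice of the output along `axis` is non-negative and sums to 1, which is why softmax is commonly used to turn logits into a probability distribution.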

Args:

logits: A non-empty Tensor. Must be one of the following types: half, float32, float64.
axis: The dimension softmax would be performed on. The default is -1, which indicates the last dimension.
name: A name for the operation (optional).

Returns:

A Tensor. Has the same type and shape as logits.

Raises:

InvalidArgumentError: if logits is empty or axis is beyond the last dimension of logits.