tf.compat.v1.nn.crelu

Computes Concatenated ReLU.

tf.compat.v1.nn.crelu(
    features, name=None, axis=-1
)

Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation; concretely, it is equivalent to tf.concat([tf.nn.relu(features), tf.nn.relu(-features)], axis). Note that as a result this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang et al.

Args:

features: A Tensor with type float, double, int32, int64, uint8, int16, or int8.
name: A name for the operation (optional).
axis: The axis that the output values are concatenated along. Default is -1.

Returns:

A Tensor with the same type as features.
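For example, this minimal sketch (assuming TensorFlow 2.x, where the compat.v1 symbol remains callable under eager execution) shows crelu doubling the feature depth from 3 to 6:

import tensorflow as tf

x = tf.constant([[1.0, -2.0, 3.0],
                 [-1.0, 2.0, -3.0]])

# crelu concatenates relu(x) with relu(-x) along `axis` (here the
# default, -1), so the last dimension doubles from 3 to 6.
y = tf.compat.v1.nn.crelu(x)
print(y.numpy())
# [[1. 0. 3. 0. 2. 0.]
#  [0. 2. 0. 1. 0. 3.]]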