Computes Concatenated ReLU.
tf.compat.v1.nn.crelu(
    features, name=None, axis=-1
)
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. Note that as a result this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang, et al.
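A minimal sketch of this behavior, assuming TensorFlow 2.x with eager execution: crelu concatenates relu(x) with relu(-x) along the given axis, so a 4-element input produces an 8-element output.

import tensorflow as tf

# crelu(x) is relu(x) concatenated with relu(-x) along axis,
# doubling the size of that axis.
x = tf.constant([[-1.0, 2.0, -3.0, 4.0]])   # shape (1, 4)

y = tf.compat.v1.nn.crelu(x)                # shape (1, 8)

# The same result composed from primitive ops, for illustration:
y_manual = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)

print(y.numpy())         # [[0. 2. 0. 4. 1. 0. 3. 0.]]
print(y_manual.numpy())  # [[0. 2. 0. 4. 1. 0. 3. 0.]]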
Args:
  features: A Tensor with type float, double, int32, int64, uint8, int16, or int8.
  name: A name for the operation (optional).
  axis: The axis that the output values are concatenated along. Default is -1.

Returns:
  A Tensor with the same type as features.