Versioned name: ReLU-1

Category: Activation function

Short description: Element-wise ReLU activation function.

Detailed description: ReLU applies the rectified linear unit function element-wise: negative input values are replaced with zero, and non-negative values pass through unchanged.

Attributes: The ReLU operation has no attributes.

Mathematical Formulation

For each element of the input tensor, the operation computes the corresponding element of the output tensor using the following formula:

\[ y_{i} = \max(0, x_{i}) \]
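As a minimal illustration of the formula above, ReLU can be sketched with NumPy (an illustrative example, not part of the specification):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: each output element is max(0, input element).

    The output tensor has the same shape and type as the input tensor.
    """
    return np.maximum(0, x)

x = np.array([[-1.5, 0.0], [2.0, -3.0]])
y = relu(x)
print(y)  # negative elements become 0; non-negative elements are unchanged
```

Note that the output preserves both the shape and the element type of the input, as required by the output specification below.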


Inputs

  • 1: Multidimensional input tensor x of any supported numeric type. Required.


Outputs

  • 1: Result of the ReLU function applied to the input tensor x. Tensor with shape and type matching the input tensor. Required.


Example

<layer ... type="ReLU">
    <input>
        <port id="0">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </input>
    <output>
        <port id="1">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </output>
</layer>