
# relu

Apply rectified linear unit activation

## Description

The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.

This operation is equivalent to

$$f(x) = \begin{cases} x, & x > 0 \\ 0, & x \le 0. \end{cases}$$

Note

This function applies the ReLU operation to dlarray data. To apply ReLU activation within a layerGraph object or Layer array, use reluLayer.
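
For example, a minimal sketch of the layer form in a Layer array, assuming the standard Deep Learning Toolbox layer functions (input size and filter count chosen only for illustration):

layers = [
    imageInputLayer([28 28 1])    % example 28-by-28 grayscale input
    convolution2dLayer(3,16)      % 3-by-3 convolution with 16 filters
    reluLayer                     % layer form of the ReLU operation
    fullyConnectedLayer(10)
    softmaxLayer];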


Y = relu(X) computes the ReLU activation of the input X by applying a threshold operation. All values in X that are less than zero are set to zero.
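
For example, a minimal sketch using a small unformatted dlarray (values chosen arbitrarily for illustration):

X = dlarray([-2 -1 0 1 2]);
Y = relu(X)    % 1-by-5 dlarray: 0 0 0 1 2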

## Examples


Use the relu function to set negative values in the input data to zero.

Create the input data as a single observation of random values with a height and width of 12 and 32 channels.

height = 12;
width = 12;
channels = 32;
observations = 1;

X = randn(height,width,channels,observations);
X = dlarray(X,'SSCB');

Compute the ReLU activation.

Y = relu(X);

All negative values in X are now set to 0.
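
One way to confirm this (a hedged check using extractdata to obtain the underlying numeric array) is to inspect the minimum value of Y:

min(extractdata(Y),[],'all')    % never negative; 0 here because X contains negative values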

## Input Arguments


X — Input data, specified as a formatted dlarray, an unformatted dlarray, or a numeric array.

Data Types: single | double

## Output Arguments


Y — ReLU activations, returned as a dlarray. The output Y has the same underlying data type as the input X.

If the input data X is a formatted dlarray, Y has the same dimension format as X. If the input data is not a formatted dlarray, Y is an unformatted dlarray with the same dimension order as the input data.
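
A brief sketch of the two cases, using dims to inspect the dimension format (sizes chosen only for illustration):

Xf = dlarray(randn(12,12,32,1),'SSCB');   % formatted input
Yf = relu(Xf);
dims(Yf)                                  % 'SSCB' -- same format as Xf

Xu = dlarray(randn(5,4));                 % unformatted input
Yu = relu(Xu);
dims(Yu)                                  % '' -- output remains unformatted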

## Version History

Introduced in R2019b