instanceNormalizationLayer
Instance normalization layer
Description
An instance normalization layer normalizes a mini-batch of data across each channel for each observation independently. To improve convergence when training a convolutional neural network and to reduce the sensitivity to network hyperparameters, use instance normalization layers between convolutional layers and nonlinearities, such as ReLU layers.
After normalization, the layer scales the input with a learnable scale factor γ and shifts it by a learnable offset β.
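As a minimal sketch of where such a layer typically sits, the following layer array places instance normalization between a convolution and a ReLU nonlinearity; the input size, filter count, and class count are illustrative assumptions, not values from this page.

layers = [
    imageInputLayer([28 28 3])
    convolution2dLayer(3,16,'Padding','same')
    instanceNormalizationLayer          % normalize per channel, per observation
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];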
Creation
Description
layer = instanceNormalizationLayer creates an instance normalization layer.
layer = instanceNormalizationLayer(Name,Value) creates an instance normalization layer and sets the optional Epsilon, Parameters and Initialization, Learning Rate and Regularization, and Name properties using one or more name-value arguments. You can specify multiple name-value arguments. Enclose each property name in quotes.
Example: instanceNormalizationLayer('Name','instancenorm') creates an instance normalization layer with the name 'instancenorm'.
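As a hedged sketch of the second syntax, the following creates a layer and sets the Epsilon and Name properties; the epsilon value 1e-4 is only an illustrative choice, not a recommended setting.

layer = instanceNormalizationLayer('Epsilon',1e-4,'Name','instancenorm')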
Properties
Examples
Algorithms
The instance normalization operation normalizes the elements x_i of the input by first calculating the mean μ_I and variance σ_I² over the spatial and time dimensions for each channel in each observation independently. Then, it calculates the normalized activations as

x̂_i = (x_i − μ_I) / sqrt(σ_I² + ε),

where ε is a constant that improves numerical stability when the variance is very small.
To allow for the possibility that inputs with zero mean and unit variance are not optimal for the operations that follow instance normalization, the instance normalization operation further shifts and scales the activations using the transformation

y_i = γ x̂_i + β,

where the offset β and scale factor γ are learnable parameters that are updated during network training.
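The following sketch reproduces the two formulas above on a random array, assuming an H-by-W-by-C-by-N layout (spatial, channel, observation); the gamma and beta values are arbitrary stand-ins for the learned parameters.

X = randn(8,8,3,2);                 % H-by-W-by-C-by-N activations
epsilon = 1e-5;

mu = mean(X,[1 2]);                 % per-channel, per-observation mean
sigma2 = var(X,1,[1 2]);            % population variance over spatial dims
Xhat = (X - mu)./sqrt(sigma2 + epsilon);

gamma = reshape([1 2 0.5],1,1,3);   % scale factor (arbitrary stand-in values)
beta  = reshape([0 -1 1],1,1,3);    % offset (arbitrary stand-in values)
Y = gamma.*Xhat + beta;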
Version History
Introduced in R2021a