# Is it important to normalise the input to a neural network before training?

4 views (last 30 days)
Newman on 15 Jul 2016
Edited: Greg Heath on 27 Jul 2016
I have a feature matrix of size 10000x400 (400 samples), and the target matrix is 40x400 (40 classes). The input feature vector for each sample has 10,000 rows, with values like 0 123 212 242 123 45, etc. So I want to ask: should I normalise all the elements in the rows using the standard formula

element of row = (element of row - mean (of column)) / standard deviation (of same column)
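As a quick numeric illustration of that formula (a NumPy sketch with made-up numbers, not the actual data; in MATLAB the equivalent is `zscore`, or `mapstd` in the Neural Network Toolbox), standardizing each feature across the samples:

```python
import numpy as np

# Toy stand-in for the 10000x400 matrix: 3 features (rows) x 5 samples (columns).
X = np.array([[0., 123., 212., 242., 45.],
              [10., 20., 30., 40., 50.],
              [3., 5., 7., 6., 4.]])

# Standardize each feature across the samples: subtract the feature's mean,
# then divide by its standard deviation.
mu = X.mean(axis=1, keepdims=True)
sigma = X.std(axis=1, keepdims=True)
Xz = (X - mu) / sigma

print(Xz.mean(axis=1))  # every feature now has mean ~0
print(Xz.std(axis=1))   # ...and standard deviation ~1
```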

Greg Heath on 16 Jul 2016
Edited: Greg Heath on 27 Jul 2016
1. Delete and/or modify numerical outliers. Standardization of the data to zero-mean/unit-variance is the most effective way to do this.
2. Keep the ranges of all input and target vector components comparable, to help understand their relative importance.
3. Consider biases to be weights that act on unit vector components.
4. Keep the initial scalar products of weights and vectors within the linear regions of the sigmoids, to avoid algebraic stagnation in the asymptotic regions.
5. Data scaling to [-1,1] is the MATLAB default. Standardization and no scaling are the alternatives. Since you already have unscaled and standardized data, you have a variety of choices. My choice is to use the standardized data but accept the [-1,1] default.
Why? ... because it is the easiest to code and understand.
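For concreteness, the two scalings mentioned above can be sketched in NumPy (illustrative numbers only; the toolbox equivalents are `mapminmax` and `mapstd`):

```python
import numpy as np

x = np.array([0., 123., 212., 242., 45.])  # one feature across five samples

# mapminmax-style default: linearly map the observed range onto [-1, 1]
x_minmax = 2 * (x - x.min()) / (x.max() - x.min()) - 1

# mapstd-style standardization: zero mean, unit variance
x_std = (x - x.mean()) / x.std()

print(x_minmax.min(), x_minmax.max())  # exactly -1.0 and 1.0
print(x_std.mean(), x_std.std())       # ~0.0 and ~1.0
```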
Hope this helps.
Thank you for formally accepting my answer
Greg
Newman on 16 Jul 2016
What do you mean by regularised data, and how do I accept the default -1 to 1? Also, one more question: should I normalise the image input before extracting the feature vector, or should I normalise the feature vector itself?
Greg Heath on 17 Jul 2016
1. My bad. Zero-mean/unit-variance is STANDARDIZATION.
MATLAB:
help zscore
doc zscore
NNTOOLBOX:
help mapstd
doc mapstd
2. Defaults do not have to be accepted. They are what the algorithm uses if an alternative is not specified.
3. Typically, feature vectors are combined to create feature matrices, so that the inputs and outputs are matrices. If you decide not to accept the MAPMINMAX default, you can use
a. MAPSTD
b. '' % No normalization
Hope this helps.
Greg


### More Answers (1)

Walter Roberson on 15 Jul 2016
Edited: Walter Roberson on 15 Jul 2016
Algebraically it is not important, as long as you adjust your transfer functions appropriately. In practice, with floating-point round-off and limited range, there could be some effects, anywhere from minor to major, depending on your transfer functions.
Normalizing makes it a lot easier to compare the effects of different parameters. If A varies twice as much as B, is that because A is more important in determining the correlation, or is it because the range of A is larger and A is actually less important? When you normalize, you do not have to think as much about how to interpret the results.
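A tiny numeric illustration of that point (hypothetical data, sketched in NumPy): two inputs that carry the same information but live on very different scales get wildly different least-squares weights, and standardizing the inputs makes the weights directly comparable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
a = rng.normal(size=n)         # feature A on a "unit" scale
b = 1000 * rng.normal(size=n)  # feature B: same kind of signal, 1000x the range
y = a + b / 1000               # both features contribute equally to the target

X = np.column_stack([a, b])
w_raw, *_ = np.linalg.lstsq(X, y, rcond=None)
# w_raw is about [1, 0.001]: B *looks* unimportant purely because of its scale

Xz = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each column
w_std, *_ = np.linalg.lstsq(Xz, y - y.mean(), rcond=None)
# w_std: the two weights are now nearly equal, exposing the equal contribution
print(w_raw, w_std)
```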
