How can I set the parameters of the feedforward neural network?
Xiaomin Li
on 4 Jul 2017
Commented: Xiaomin Li
on 13 Jul 2017
How can I set the parameters of the feedforward neural network? How can I find the optimal number of hidden layers and the number of nodes in each layer? Thanks a lot!
0 comments
Accepted Answer
Greg Heath
on 5 Jul 2017
For run-of-the-mill problems, you can use the default settings except for
a. The number of hidden nodes (default is 10)
b. The initial weights and biases (the default, RANDOM, is best)
When
[ I N ] = size(input)
[ O N ] = size(target)
% Network topology is I - H - O
Nval = round(0.15*N)
Ntst = round(0.15*N)
Ntrn = N - Nval - Ntst
% Number of training equations
Ntrneq = Ntrn*O   % ~ 0.7*N*O
% Number of unknown weights and biases
% Nw = (I + 1)*H + (H + 1)*O
Nw = O + (I + O + 1)*H
% Upper bound on H before there are more unknowns than equations
Hub = (Ntrneq - O)/(I + O + 1)
% OVERFITTING (Nw > Ntrneq) occurs when H > Hub
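As a purely illustrative check of the bound (the sizes here are made up, not taken from the original question), suppose I = 9 inputs, O = 1 output, and N = 100 samples:
I = 9; O = 1; N = 100;           % hypothetical example sizes
Ntst = round(0.15*N)             % 15
Nval = round(0.15*N)             % 15
Ntrn = N - Nval - Ntst           % 70
Ntrneq = Ntrn*O                  % 70 training equations
Hub = (Ntrneq - O)/(I + O + 1)   % 69/11 ~ 6.3, so keep H <= 6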
To prevent overtraining an overfit net and impairing its ability to perform well on nontraining data, implement one or a combination of the following:
a. H <= Hub % Don't overfit!
b. Train with VALIDATION STOPPING to prevent poor
performance on the validation subset and other
(e.g., testing and unseen) data
c. Use REGULARIZATION (see help/doc TRAINBR) to add
weighted sums of squared weights to the minimization function
(a sketch of both follows below).
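A minimal sketch of options b and c in current MATLAB syntax (my illustration, not Greg's code; x and t stand for your own input and target matrices, and simplefit_dataset is only placeholder data shipped with the toolbox):
[x, t] = simplefit_dataset;          % placeholder data
H = 10;                              % default number of hidden nodes

% b. Validation stopping: training stops when the validation error rises
net = fitnet(H);                     % default trainFcn = 'trainlm'
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
[net, tr] = train(net, x, t);        % tr.best_vperf = best validation MSE

% c. Bayesian regularization (help trainbr);
%    note that trainbr does not use a validation subset
netbr = fitnet(H, 'trainbr');
netbr = train(netbr, x, t);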
I tend to use VALIDATION STOPPING and a double-loop approach to minimizing H by trial and error with random initial weights (sketched below).
Search the NEWSGROUP and ANSWERS using
Hmin:dH:Hmax
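A minimal sketch of that double loop, assuming a hypothetical search range Hmin:dH:Hmax and Ntrials random weight initializations per candidate (again my own illustration, not code from the NEWSGROUP posts):
[x, t] = simplefit_dataset;              % placeholder data
Hmin = 1; dH = 1; Hmax = 10;             % hypothetical candidate range
Ntrials = 5;                             % random restarts per H
bestvperf = Inf;
for H = Hmin:dH:Hmax                     % outer loop: candidate H
    for trial = 1:Ntrials                % inner loop: random initial weights
        net = fitnet(H);                 % fresh net => new random init at train time
        [net, tr] = train(net, x, t);
        if tr.best_vperf < bestvperf     % keep the lowest validation error
            bestvperf = tr.best_vperf;
            bestH = H;
            bestnet = net;
        end
    end
end
bestH, bestvperf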
Hope this helps
Thank you for formally accepting my answer
Greg
More Answers (0)