I cannot get a stable neural network :/

1 view (last 30 days)
Fereshteh.... on 11 Dec 2014
Answered: Greg Heath on 12 Dec 2014
I have written some code to train a neural network, and no matter what I do its training performance is not stable. I know there are always some differences in training performance between runs, but in my case the results differ a lot: sometimes the network overfits, sometimes it underfits, and sometimes the training error is fair enough. It is getting really frustrating and I don't know what to do. I have changed my training functions to see if I could get better performance (for each of them I had to change the network structure too), but in the end I ran into the same problem.
net = feedforwardnet(100);
net.trainFcn = 'traingdm';
net.divideParam.trainRatio = 1;
net.divideParam.valRatio = 0;
net.divideParam.testRatio = 0;
net.trainParam.epochs = 1000;
[net,tr] = train(net,xl,TL);
Y = net(xl);
There is a specific reason for setting net.divideParam.trainRatio = 1.
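One common source of this kind of run-to-run instability is the combination of a very large hidden layer (100 neurons), plain gradient descent with momentum, and no validation set to stop on. Purely as an illustrative sketch (the smaller hidden layer, trainlm, and the default dividerand split are assumptions, not values from the original post), the spread across runs can be measured, and usually reduced, like this:

rng('default')                      % fix the random seed so the whole experiment is repeatable
Ntrials = 10;                       % repeat training to see the spread of results
mseTrial = zeros(1,Ntrials);
for i = 1:Ntrials
    net = feedforwardnet(10);       % far fewer hidden neurons than 100 (assumed value)
    net.trainFcn = 'trainlm';       % Levenberg-Marquardt is usually more reliable than traingdm
    net.divideFcn = 'dividerand';   % keep a validation set for early stopping (default 0.7/0.15/0.15)
    net.trainParam.epochs = 1000;
    [net,tr] = train(net,xl,TL);    % xl, TL are the original poster's data
    Y = net(xl);
    mseTrial(i) = perform(net,TL,Y);
end
mseTrial                            % a large spread here means training is unstable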

Accepted Answer

Greg Heath on 12 Dec 2014
% No ending semicolons so that output is readily seen
% Modifications needed if I or O is not 1
% For many examples including (I~=1 | O~=1) search the NEWSGROUP using
% greg fitnet Ntrials
% Code below is illustrative. I didn't test it. Again, see my previous posts for details.
[ I N ] = size(x) % = ?
[ O N ] = size(t) % = ?
xt = [ x; t ];
minmaxxt = minmax(xt)
zx = zscore(x,1);
zt = zscore(t,1);
zxt = [ zx; zt ];
minmaxzxt = minmax(zxt)
figure(1) % 3 subplots zx, zt and zt vs zx
% Remove or modify outliers
Ntrneq = N*O % Number of training equations (1/0/0) division
% Nw = (I+1)*H+(H+1)*O Number of unknown weights for H hidden nodes
% Ntrneq >= Nw <==> H <= Hub (upper bound for no overfitting)
Hub = -1 + ceil( (Ntrneq-O)/(I+O+1) )
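% Illustrative check (numbers assumed for the example, not from the original post):
% with I = 1, O = 1 and N = 94, Ntrneq = 94 and
% Hub = -1 + ceil( (94-1)/(1+1+1) ) = 30,
% so any H <= 30 keeps Nw <= Ntrneq, i.e. no more weights than training equations.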
Hmax =  % choose a value <= Hub
dH =    % step between candidate values of H
Hmin =  % choose a value >= 1
Ntrials = 10
netmin = fitnet;
MSEmin = 1000
Imin = 0
Jmin = 0
rng('default') % For replicating results
j = 0
for h = Hmin: dH: Hmax
j=j+1
net = fitnet(h);
Nw = (I+1)*h+(h+1)*O
Ndof = Ntrneq-Nw
MSEgoal = 0.01*Ndof/Ntrneq;
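% (zt is standardized, so var(zt,1) = 1; this goal asks for SSE/Ndof <= 0.01,
%  i.e. roughly a degree-of-freedom-adjusted R^2 of 0.99)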
net.divideFcn = 'dividetrain'; % 1/0/0
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MSEgoal/100;
for i = 1 : Ntrials
net = configure(net,zx,zt);
[ net tr y e ] = train(net,zx,zt);
MSEtmp = mse(e);
if MSEtmp < MSEmin
netmin = net;
Imin = i
Jmin = j
MSEmin = MSEtmp
end
end
end
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Greg Heath on 12 Dec 2014
H = 10 % default value used explicitly
net = fitnet( H ); % For regression
net.divideFcn = 'dividetrain'; % 1/0/0 division
rng( 'default' ) % In order to duplicate results later
[ net, tr ] = train( net, x, t);
y = net( x );
R2 = 1 - mse(t-y)/var(t,1) % Desire R2 >= 0.99
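To see how stable this minimal script is from run to run, it can be wrapped in a small trial loop, continuing from the H above and the same x, t training data. This is only a sketch (Ntrials = 10 is an assumed value; the R2 values are collected per trial rather than printed once):

rng( 'default' )
Ntrials = 10;
R2 = zeros( 1, Ntrials );
for i = 1 : Ntrials
    net = fitnet( H );
    net.divideFcn = 'dividetrain';          % 1/0/0 division, as above
    [ net, tr ] = train( net, x, t );
    y = net( x );
    R2(i) = 1 - mse( t - y ) / var( t, 1 ); % one R2 per trial
end
R2                                          % a wide spread means unstable training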
For many, many examples search the NEWSGROUP using
greg fitnet Ntrials
If you are still a glutton for punishment, search in ANSWERS
Hope this helps
Thank you for formally accepting my answer
Greg
