Neural Network sim(net, input) gives crazy results
Flo Trentini
on 26 Aug 2011
Answered: Elaheh
on 10 Dec 2013
Once I have trained my network, I use the sim(net, input) function to get the results. The results are incredible, and they are different from the results obtained by manual matrix calculation using net.IW{1} etc. I obtain results around 10^4 with sim, whereas the matrix calculation gives results around 1!
Here is a zip file of the workspace and the code that reproduce what I am talking about.
I also copy-paste the code here in case you want to have a quick look:
% in the workspace, 'imp' is the input data (7 variables per sample) and
% 'targ' is the target data
sz = size(imp);
% column indices for splitting into training, validation, and test sets
d1=round(sz(2)/2); % the first half of the dataset
d2=round((sz(2)-d1)/2)+d1; % half of the remaining part, i.e. the third quarter
d3=sz(2); % the last quarter
% network with 1 input layer (input size is 7), one hidden layer of 5
% neurons, and a single neuron in the output layer
net = newff(imp,targ,5);
% actual separation for train test and validation set
imp1=imp(:,1:d1); % input for training
targ1=targ(:,1:d1); % target for training
VV.P=imp(:,d1+1:d2); % validation set
VV.T=targ(:,d1+1:d2);
VT.P=imp(:,d2+1:d3); % test set
VT.T=targ(:,d2+1:d3);
net.inputWeights{1,1}.initFcn = 'rands';
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
net = init(net);
net = train(net,imp1,targ1,[],[],VV,VT); % keep the trained network
% simulation on the full dataset
y1 = sim(net,imp);
% layer biases expanded for direct calculation to the size of the dataset
B1 = net.b{1}*ones(1,size(imp,2)); % all columns are identical and equal to net.b{1}
B2 = net.b{2}*ones(1,size(imp,2));
OutLayer1 = tansig(net.IW{1}*imp+B1); % output from layer 1 (the hidden layer)
OutLayer2 = purelin(net.LW{2}*OutLayer1+B2); % output from layer 2, the output layer
y2 = OutLayer2; % just to give an easy name
% now you can compare y1 and y2
plot(1:d3,y1,'o',1:d3,y2,'x');
% NOTE THE *10^4 on the y-axis
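The manual computation above is just two affine maps with a tanh squashing function in between (tansig is tanh; purelin is the identity). A minimal sketch of that same arithmetic in plain Python, with made-up weights standing in for net.IW{1}, net.LW{2}, net.b{1}, and net.b{2}:

```python
import math

# Hypothetical tiny network: 2 inputs, 2 hidden tansig units, 1 linear output.
IW = [[0.5, -0.3],
      [0.2, 0.8]]        # hidden-layer weights (plays the role of net.IW{1})
b1 = [0.1, -0.2]         # hidden-layer biases  (net.b{1})
LW = [1.0, -0.5]         # output-layer weights (net.LW{2})
b2 = 0.05                # output-layer bias    (net.b{2})

def forward(x):
    """Manual forward pass: purelin(LW * tansig(IW*x + b1) + b2)."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(IW, b1)]
    return sum(w * h for w, h in zip(LW, hidden)) + b2

y = forward([1.0, 2.0])
```

This is exactly the calculation done with B1, B2, OutLayer1, and OutLayer2 above, just one sample at a time instead of as a matrix product over the whole dataset.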
Accepted Answer
Lucas García
on 26 Aug 2011
As surprising as it may be, the output of the network is correct. So why such huge differences from your otherwise correct mathematical reconstruction of the network?
The network normalizes the input data before processing it and then transforms the output back. This is done with the function mapminmax. You can find it listed among your network's processing functions; if you remove these functions, you will obtain the same results as your math:
net.inputs{1}.processFcns
net.outputs{2}.processFcns
However, I don't recommend removing them. It is a good idea to normalize your data before presenting it to the network, or your weights could get too big.
To follow the math of the network, you can do the following:
imp2 = mapminmax('apply',imp,net.inputs{1}.processSettings{3});
OutLayer1 = tansig(net.IW{1}*imp2+B1);
OutLayer2 = purelin(net.LW{2}*OutLayer1+B2);
y2 = mapminmax('reverse',OutLayer2,net.outputs{2}.processSettings{2});
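By default, mapminmax rescales each row linearly onto [-1, 1] using the minimum and maximum stored in the process settings, and 'reverse' applies the inverse map. A sketch of that arithmetic in plain Python (not MATLAB itself, just the underlying formula):

```python
# mapminmax's default per-row mapping onto [ymin, ymax] = [-1, 1]:
#   y = (ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin
def mapminmax_apply(row, xmin, xmax, ymin=-1.0, ymax=1.0):
    return [(ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin for x in row]

# 'reverse' is the inverse linear map back to the original range.
def mapminmax_reverse(row, xmin, xmax, ymin=-1.0, ymax=1.0):
    return [(y - ymin) * (xmax - xmin) / (ymax - ymin) + xmin for y in row]

raw = [2.0, 5.0, 8.0]
lo, hi = min(raw), max(raw)
scaled = mapminmax_apply(raw, lo, hi)     # [-1.0, 0.0, 1.0]
back = mapminmax_reverse(scaled, lo, hi)  # recovers raw
```

The key point is that xmin and xmax come from the stored settings (computed when the network was created), so the same mapping is applied to every dataset you simulate, not recomputed per dataset.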
Now your plot of y1 and y2 should be the same.
More Answers (1)
Elaheh
on 10 Dec 2013
Dear Lucas,
I have done the same normalization as you said. The results of my sim function for the training dataset are the same as MATLAB's sim function. However, for the test dataset my results are totally different from MATLAB's sim function! I know it is strange. Could you please help me?
Thanks, Elaheh