MATLAB neural network: strange simulation performance

ezoukova on 25 Apr 2013
Hello, and thank you for giving me the chance to ask this question. I am a relative beginner in MATLAB and neural network concepts, researching a prediction system for some economic and demographic data based on a neural network.
With a double input matrix (manually normalized) and a double output matrix (manually normalized), after a long session of training and comparing performances (the lower the better) for different functions, I finally created five neural networks with the following sets of MATLAB functions:
1. newcf  trainlm  initnw  mse   learngd   satlin
2. newcf  trainlm  initnw  msne  learngdm  compet
3. newelm trainlm  initwb  mse   learnhd   purelin
4. newff  trainlm  initnw  mse   learngd   purelin
5. newff  trainlm  initnw  mse   learnwh   tansig
The best performance results (perf) for each of them vary from xxxE-30 to xxxE-32.
Still, after simulating those networks on each single column of the input matrix, I got the expected output in only 60% of the cases, while the other 40% were completely wrong.
I see exactly the same 60%/40% split between good and bad simulation results for all of the above networks, with different bad columns per net.
Can something like this happen? What do you think could be wrong? Maybe the perf result from training is not enough to judge whether a neural network is good enough? Maybe I misunderstood something in the concepts?
Thank you in advance.
P.S. The approximate (pseudo)code of my workflow (MATLAB R2010) is given below:
% Input and output data as numeric matrices
% (decimal points, not commas; square brackets, not cell braces):
inp = [ ...
    0.1300  0.0300  0.0300  0.0300 -0.0100  0.0300  0.0900  0.0100  0.0600  0.0700;
    0.0500 -0.0400 -0.0400 -0.0400 -0.0100  0.0100  0.1400  0.0900  0.0600 -0.1700;
    ............
];
outp = [ ...
    0.0427 -0.1071  0.0605 -0.0637 -0.0410  0.2566 -0.0551 -0.0902 -0.2483  0.1543;
   -0.0249  0.0192 -0.1199 -0.3748  0.3212  0.5490 -0.1655 -0.1213 -1.0236  0.4678;
    0.1000 -0.1000  0.4000  0.2000 -0.3000 -0.9000 -0.6000 -0.7000  0.2000  0.4000;
    ...........
];
[out_r, out_c] = size(outp);
[inp_r, inp_c] = size(inp);
%------------ for 2 layers:
biasConnect   = [1; 1];
inputConnect  = [1; 0];
layerConnect  = [0 0; 1 0];
outputConnect = [0 1];
%------------ or for 3 layers:
% biasConnect   = [1; 1; 1];
% inputConnect  = [1; 0; 0];
% layerConnect  = [0 0 0; 1 0 0; 0 1 0];
% outputConnect = [0 0 1];
my_net = network(1, 2, biasConnect, inputConnect, layerConnect, outputConnect); % create custom network
my_net.inputs{1}.size  = inp_r; % number of elements in an input vector
my_net.outputs{2}.size = out_r; % output is attached to layer 2 (outputConnect = [0 1])
% NOTE: the next call discards the custom network built above and replaces it
% with a standard feed-forward backpropagation network:
my_net = newff(inp, outp, [inp_r out_r]);
% or for 3 layers: my_net = newff(inp, outp, [inp_r round(inp_r/2) out_r]);
my_net.layers{1}.size = inp_r;            % 1st layer size
my_net.layers{1}.transferFcn = 'purelin'; % transfer function
my_net.layers{1}.initFcn = 'initnw';
%-------------------- Functions ---------------------------------%
my_net.divideFcn = 'divideblock';
my_net.plotFcns  = {'plotperform','plottrainstate'};
my_net.initFcn    = 'initlay'; % layer init function
my_net.performFcn = 'mse';     % performance function
my_net.trainFcn   = 'trainlm'; % training function
my_net.adaptFcn   = 'learngd'; % should be from list: learngdm, learngd
%------------------- set a few training params and train the net
my_net.trainParam.epochs = 100;
my_net.trainParam.goal = 1.0000e-030;
my_net.trainParam.max_fail = 3;
my_net.trainParam.mu = 1.0000e-03;
my_net.trainParam.mu_inc = 10;
my_net.trainParam.mu_dec = 0.1000;
my_net.trainParam.mu_max = 1e10;
my_net.trainParam.showWindow = false;
my_net.trainParam.showCommandLine = false;
my_net.trainParam.show = 0;
[my_net, tr] = train(my_net, inp, outp); % train the network
%--- After the best my_net is found, I run a simulation for each column:
Y = sim(my_net, inp(:,i)); % for each i-th column of inp; I expect Y = outp(:,i)
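One way to cross-check the training record against the column-by-column simulations above is to recompute the performance by hand on the full data set (a sketch; it assumes `my_net`, `inp`, and `outp` from the code above exist as numeric matrices):

```matlab
% Sanity check (sketch): recompute the training performance by hand.
% With performFcn = 'mse', perform() should match tr.perf(end) when
% evaluated on the same data, in the same (normalized) scale.
Y_all = sim(my_net, inp);                    % simulate ALL columns at once
mse_manual = mean((outp(:) - Y_all(:)).^2);  % mean squared error by hand
mse_perform = perform(my_net, outp, Y_all);  % toolbox performance value
fprintf('manual MSE = %g, perform() = %g\n', mse_manual, mse_perform);
```

If the two values disagree by orders of magnitude, the likely cause is that they are being computed on differently scaled data, e.g. because the toolbox applies its own input/output processing (mapminmax) internally.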

Accepted Answer

Greg Heath on 27 Apr 2013
What version do you have?
If this is classification/pattern recognition, use patternnet, newpr (OLD), or newff (VERY OLD).
If this is regression/curve fitting, use fitnet, newfit (OLD), or newff (VERY OLD).
1. Use ALL of the defaults, as in the help examples, e.g.,
help fitnet
2. Apply them to the MATLAB nndatasets example that is most similar to yours:
help nndatasets
3. Apply them to your data.
If you have any problems, post a reply with error messages and/or code.
Hope this helps.
Greg
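Greg's step 1 can be illustrated with the standard example from `help fitnet` (this assumes a toolbox version that includes `fitnet` and the built-in `simplefit_dataset`):

```matlab
[x, t] = simplefit_dataset;  % built-in example data set
net = fitnet(10);            % one hidden layer of 10 neurons, all defaults
net = train(net, x, t);      % default trainlm, default data division
y = net(x);                  % simulate the trained network
perf = perform(net, t, y);   % default performance function: mse
```

Only after this all-defaults version works on a known data set would the custom settings from the question be reintroduced one at a time.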
2 comments
ezoukova on 5 May 2013
Dear Greg,
Thank you very much for your advice. I've tried working with MATLAB 2010a and 2012a. For training purposes, I'm trying to create a simulation neural network with a 14x10 input matrix and a 5x10 output matrix.
I get no error messages, but I'm not satisfied with the simulation results. The training record's perf value is very low (about 4.73638e-033), while simulation on the same data sets gives a rather high performance value (about 333.56): Y = sim(my_net, input_data); current_performance = perform(my_net, output_data, Y);
It seems I do not understand the exact meaning of the training record's perf value versus the value returned by the perform function. Is there any correlation between them? What values should I expect from these indicators when the net is good enough? Thanking you in advance, Elena
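The relationship Elena asks about can be sketched as follows (using her `input_data`/`output_data` names; `fitnet` here is only an illustration, not her original network): `tr.best_perf` is the performance value recorded during training, and `perform` recomputes the same performance function, so on the same data and the same target scale the two should agree.

```matlab
net = fitnet(10);                            % illustrative small fitting net
[net, tr] = train(net, input_data, output_data);
Y = net(input_data);                         % simulate on the training data
p = perform(net, output_data, Y);            % same performFcn ('mse') as training
% If p is orders of magnitude larger than tr.best_perf, the two are being
% computed on different data or on differently scaled targets.
```

A gap like 4.7e-33 versus 333.56 therefore usually means the comparison is not apples-to-apples, not that the network suddenly got worse.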
Greg Heath on 6 May 2013
You did not answer my questions or follow my advice.

