MLP classification: what is the problem in my code?
I would like to understand why the MLP neural network I built performs badly. With two hidden layers the network should be a universal classifier, but with my data set it does not train well. My network uses:
- a sigmoid transfer function in the first and second layers
- a softmax function in the output layer

The "inputs" file is a 3x120 matrix: 3 features and 120 observations. The "targets" file is also a 3x120 matrix (one row per class, 3 classes).
In the example code I used a network with 40 neurons in the first layer and 20 in the second layer.
I notice two anomalies:
- if I run view(net), the output layer shows 2 outputs while the output box shows 3: what does this mean, and why does MATLAB do this?
- if I compute sum(net([i1; i2; i3])), the result differs from 1, but it should equal 1 because the last layer uses a softmax function.
My code is below; the inputs and targets files are attached.
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainbr'; % Bayesian regularization backpropagation.
% Create a Pattern Recognition Network
hiddenLayerSize = [40 20];
net = feedforwardnet(hiddenLayerSize); % create feedforward network
% set transfer functions
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
net.layers{3}.transferFcn = 'softmax';
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,x,t); % x = inputs, t = targets (3x120 matrices)
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
tind = vec2ind(t);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
figure, plotconfusion(t,y)
%figure, plotroc(t,y)
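A plausible explanation for both anomalies (a sketch, assuming the Deep Learning Toolbox defaults; `x` and `t` stand for the attached 3x120 inputs and targets): feedforwardnet applies output processing functions ('removeconstantrows' and 'mapminmax') by default. mapminmax reverse-maps the softmax activations, so sum(net(x)) need not equal 1, and removeconstantrows can shrink the output size shown by view(net).

```matlab
% Test: disable the default output processing and check whether the
% softmax outputs then sum to 1 and view(net) shows 3 outputs.
net = feedforwardnet([40 20]);
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
net.layers{3}.transferFcn = 'softmax';
net.outputs{3}.processFcns = {};  % no output-side rescaling
net = configure(net, x, t);       % x: 3x120 inputs, t: 3x120 targets
view(net)                         % output size should now show 3
```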

4 comments
Greg Heath on 22 Sep 2017
When you have a coding problem, post the results using one of the MATLAB NN examples. Type
help nndatasets
and
doc nndatasets
If you search both the NEWSGROUP and ANSWERS using
nndatasets
you will probably find answers for those examples.
Hope this helps.
Greg
Greg Heath on 22 Sep 2017
PLEASE EMAIL YOUR DATA; I CANNOT DOWNLOAD IT.
GREG
mike mike on 22 Sep 2017
Greg Heath on 23 Sep 2017
If you click on my name, you will see my community profile. The address in the profile is
heath@alumni.brown.edu
Greg
Accepted Answer
More Answers (1)
Greg Heath on 22 Sep 2017
GEH1: The best network function for classification is PATTERNNET.
GEH2: Your targets should be {0,1} unit vectors.
GEH3: The best training function for classification is TRAINSCG.
GEH4: ONE hidden layer suffices for a UNIVERSAL APPROXIMATOR.
GEH5: Network creation involves the sizes of ONLY the hidden layers, NOT the sizes of the input and output.
GEH6: For STABILITY & GENERALIZATION to nontraining data (e.g., validation, test, and unseen) with similar summary statistics (e.g., mean, std, etc.):
a. Use a VALIDATION subset.
b. MINIMIZE the number of HIDDEN NODES.
GEH7: Unable to download your data to test the sum(softmax) claim.
GEH8: Take a look at my QUICKIES NEWSREADER posts.
Hope this helps.
Thank you for formally accepting my answer
Greg
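Putting these recommendations together, a minimal sketch (assuming the attached 3x120 `inputs`/`targets` variables; the hidden size of 10 is an arbitrary starting point to be minimized, not a tuned value):

```matlab
% Sketch of the recommendations above: patternnet + trainscg with one
% hidden layer and one-hot {0,1} targets. patternnet uses a softmax
% output layer and cross-entropy loss by default, so columns of y sum to 1.
x = inputs;   % 3x120 feature matrix
t = targets;  % 3x120 one-hot class matrix
net = patternnet(10, 'trainscg');     % hidden size 10: starting guess
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
[net, tr] = train(net, x, t);
y = net(x);
percentErrors = sum(vec2ind(t) ~= vec2ind(y)) / size(t, 2);
```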