Custom Neural Network (Manually re-implementing 'patternnet' using 'network')

Ray Morano on 28 Apr 2017
Edited: Greg Heath on 28 Apr 2017
Hello everyone,
I want to eventually build a custom network with a complex structure. I thought I should start with the simplest one, so I tried building a 2-layer feedforward network manually (without using the app), using the 'network' function with the code given below:
net = network(1,2,[1;1],[1;0],[0,0;1,0],[0,1]);
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'softmax';
net.inputWeights{1,1}.initFcn = 'initzero'; % set input weights init function
net.inputWeights{1,1}.learnFcn = 'learnp';
net.layerWeights{2,1}.initFcn = 'initzero'; % layer 1-to-2 weights (inputWeights{2,1} does not exist here, since inputConnect = [1;0])
net.layerWeights{2,1}.learnFcn = 'learnp';
net.layers{1}.size=50;
net.initFcn = 'initlay';
net.trainFcn = 'trainscg';
net.performFcn = 'crossentropy';
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 75/100;
net.divideParam.valRatio = 20/100;
net.divideParam.testRatio = 5/100;
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotconfusion', 'plotroc'};
net = configure(net,Samples,Targets);
net = init(net);
[net,tr] = train(net, Samples, Targets);
The network looks exactly the same as the one from 'patternnet', and I tried to define all the parameters the same way. However, when I train on the same dataset I get two completely different results. The training results with the app and 'patternnet' make much more sense than the training with my own hand-built network! My network stops after 2 iterations with poor performance, while the one using 'patternnet' goes beyond 200 iterations with fair performance.
Is there anything I am missing here, or is there any justification for this?
Thank you so much in advance!

Answers (3)

Steven Lord on 28 Apr 2017
Are you sure you used the same pre- and post-processing functions?
  1 comment
Ray Morano on 28 Apr 2017
Thanks for the reply, Steven. Yes — it's not in the code above, but I actually tried the same pre- and post-processing as 'patternnet', which was 'removeconstantrows'. I am wondering if I need to initialize the weights?
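One way to check for such mismatches (a sketch, assuming the Deep Learning Toolbox and the same Samples/Targets variables from the question) is to build the reference network and print the properties that 'patternnet' sets but the hand-built code does not:

```matlab
% Sketch: inspect a configured patternnet to see which defaults the
% hand-built net must reproduce (Samples/Targets as in the question).
pnet = patternnet(50);                    % reference network, H = 50
pnet = configure(pnet, Samples, Targets);
disp(pnet.inputs{1}.processFcns)  % {'removeconstantrows','mapminmax'}
disp(pnet.outputs{2}.processFcns) % output post-processing
disp(pnet.layers{1}.initFcn)      % 'initnw' (Nguyen-Widrow), not 'initzero'
disp(pnet.layers{2}.initFcn)
```

Two likely culprits stand out in the question's code: 'initzero' starts every weight at zero (patternnet uses Nguyen-Widrow initialization), and no 'mapminmax' input processing is set.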



Greg Heath on 28 Apr 2017
H = 50 is excessive and probably causes overfitting/training problems.
When starting something new, ALWAYS BEGIN with as many default values as possible and with the datasets from the MATLAB help/doc documentation.
Then, after experimenting with parameter-value changes, apply the method to your own data and experiment with changing parameter settings.
Changing one parameter at a time tends to be rather foolproof, but may be extremely tedious (especially if you have a large dataset). So you may want to reduce your dataset to a smaller one that is still representative of the whole set. A decent rule of thumb is to have 10 to 30 examples per input dimension.
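A minimal version of that advice (a sketch, assuming one of the toolbox's documentation datasets such as iris_dataset):

```matlab
% Sketch: all-defaults run on a documentation dataset before customizing.
[x, t] = iris_dataset;         % built-in demo data: 4 inputs, 3 classes
net = patternnet(10);          % default hidden size; everything else default
[net, tr] = train(net, x, t);  % confirm sensible training before changing parameters
```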
Hope this helps,
Thank you for formally accepting my answer
Greg
  1 comment
Ray Morano on 28 Apr 2017
Thanks for your reply, Greg. I have no concerns regarding overfitting or the overall performance of the network; the number of neurons (i.e., 50) was the same when I set up the pattern-recognition network with the MATLAB app. Furthermore, all the other parameters (activation function, etc.) that I'm defining are chosen to mimic 'patternnet'.
The question is why I am not able to re-implement 'patternnet' manually using the 'network' function.



Greg Heath on 28 Apr 2017
Edited: Greg Heath on 28 Apr 2017
1. Initialize the RNG to the same initial state.
2. Transform inputs and targets to [-1,1] before learning, and transform the output back using the inverse of the target transform after learning.
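A sketch of those two steps with standard toolbox calls (rng and mapminmax; the variable names Samples/Targets are taken from the question):

```matlab
% 1. Fix the RNG state so both networks start from identical weights.
rng(0);
% 2. Map inputs and targets to [-1,1] before training; after training,
%    reverse the target mapping to get outputs on the original scale.
[xn, xs] = mapminmax(Samples);     % xs/ts hold the mapping settings
[tn, ts] = mapminmax(Targets);
net = init(net);
[net, tr] = train(net, xn, tn);
yn = net(xn);                      % outputs in [-1,1]
y  = mapminmax('reverse', yn, ts); % back to the original target scale
```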
Hope this helps.
Thank you for formally accepting my answer
Greg
