neural network poor performance
Hello, this is the first time I have worked with the Neural Network Toolbox. I designed a network using newff; the goal is to map the input, a 4x600 matrix of MAVs taken from 4 muscles, to an expected output angle.
I followed the instructions given at: http://www.mathworks.com/matlabcentral/answers/137-how-do-i-improve-my-neural-network-performance
However, when I look at the regression plot I get a very low regression index, R = 0.16882; the fit line is almost horizontal and there is a lot of dispersion.
If someone could point me in the right direction, I'd be grateful. Also, I'm using trainlm, and I noticed that training stops because it reaches the validation check limit too quickly.
Here is my code:
net = newff(Input,Target,20);          % feed-forward net with 20 hidden neurons
net.trainParam.goal = 1e-6;            % performance goal
net.trainParam.max_fail = 6;           % maximum validation failures
net.performFcn = 'msereg';             % mean squared error with regularization
net.performParam.ratio = 0.5;
[net,tr] = train(net,Input,Target);
yTargets = sim(net,Input);
plotregression(Target,yTargets);
2 comments
Hessam
on 6 Jan 2012
I think you should set the parameter net.divideFcn to 'dividerand' (it's the default option when you do not specify the divide function explicitly). Then use the following parameters to change the ratio of each part:
net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 35/100;
However, the validation check (the following parameter) is a very tricky part:
net.trainParam.max_fail
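For instance, a minimal sketch putting those pieces together (Input and Target are the matrices from the question; the max_fail value is only illustrative, not a recommendation):
% Explicit data division plus a looser validation-stop criterion (illustrative values)
net = newff(Input,Target,20);          % same architecture as in the question
net.divideFcn = 'dividerand';          % random train/validation/test split
net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio   = 5/100;
net.divideParam.testRatio  = 35/100;
net.trainParam.max_fail    = 20;       % default is 6; larger values delay the validation stop
[net,tr] = train(net,Input,Target);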
Hessam
Accepted Answer
Greg Heath
on 16 Dec 2011
Find H, the smallest number of hidden nodes that yields a satisfactory design, by looping over nH candidate values and trying Ntrials different designs for each value of H. Then choose the best of the nH*Ntrials designs.
I typically look at nH = 10, Ntrials = 10 first. Then accept or modify depending on the result.
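A rough sketch of that search (hypothetical; it reuses the newff call, Input, and Target from the question, and picks the design with the largest R^2 relative to the constant-mean model):
% Loop over candidate hidden-layer sizes and random initializations,
% keeping the best of the nH*Ntrials candidate designs.
Hcand   = 1:10;                         % nH = 10 candidate numbers of hidden nodes
Ntrials = 10;                           % designs per candidate value of H
N       = size(Target,2);
MSE00   = mse(Target - repmat(mean(Target,2),1,N));   % naive constant model
bestR2  = -Inf;
for H = Hcand
    for trial = 1:Ntrials
        net      = newff(Input,Target,H);
        net      = init(net);           % fresh random weights for each trial
        [net,tr] = train(net,Input,Target);
        y        = sim(net,Input);
        R2       = 1 - mse(Target - y)/MSE00;
        if R2 > bestR2
            bestR2 = R2;  bestnet = net;  bestH = H;
        end
    end
end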
Hope this helps.
Greg
2 comments
Greg Heath
on 8 Jan 2012
> Thanks Greg, I have tried that too.
Hmm. Since that has ALWAYS worked for me, I would like to see your code.
Hope this helps.
Greg
More Answers (2)
Mo al
on 14 Dec 2011
You have to show your code. Very quick suggestions: 1) increase your performance goal; 2) re-initialize your network.
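For instance (a rough illustration of those two suggestions, with an arbitrary goal value):
net.trainParam.goal = 1e-3;   % a looser goal than the 1e-6 in the question
net = init(net);              % re-initialize the weights and biases
[net,tr] = train(net,Input,Target);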
0 comments
Hessam
on 6 Jan 2012
Just a very important point for my work: how does the performance function calculate the MSE? Is it normalized? Otherwise it would be useless.
1 comment
Greg Heath
on 8 Jan 2012
MSE = mse(target-output)
It can be normalized by MSE00, the MSE obtained from the naive constant model: 
output00 = repmat(mean(target,2),1,Ntrn)
MSE00 = mse(target-output00)
NMSE = MSE/MSE00 % normalized MSE
R2 = 1 - NMSE % R^2 statistic
Useless might be an appropriate characterization for multiple-output nets when the outputs have different scales. Therefore, if the outputs have different scales, it would be wise to normalize them.
For regression with multiple real-valued outputs, I prefer standardization (zero mean / unit variance).
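For example, a sketch using mapstd from the toolbox (variable names are hypothetical):
[Tn,Ts]  = mapstd(Target);              % standardize each target row: zero mean, unit variance
[net,tr] = train(net,Input,Tn);         % train against the standardized targets
yn = sim(net,Input);
y  = mapstd('reverse',yn,Ts);           % map the outputs back to the original units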
For classification among c classes, using targets that are columns of the c-dimensional unit matrix eye(c) is sufficient.
Hope this helps.
Greg