neural network training function

Suzanne Hussein on 18 May 2015
Commented: Greg Heath on 18 May 2015
How can I apply an 18x99 matrix X to the NN_training function?
...........................
function [net,tr]=NN_training(X,y,k,code,iter,par_vec)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% FUNCTION
% net=NN_training(X,y,k,code,iter,par_vec)
% Returns a trained multilayer perceptron as a MATLAB object.
%
% INPUT ARGUMENTS:
% X: lxN matrix whose columns are the vectors of the
% data set.
% y: N-dimensional vector containing the class labels for the
% data vectors.
% k: the number of nodes in the hidden layer.
% code: a parameter that specifies the training algorithm to be used
% ("1" for standard BP, "2" for BP with momentum term and "3" BP
% with adaptive learning rate).
% iter: the maximum number of iterations that will be performed by the
% algorithm.
% par_vec: a five-dimensional vector containing the values of (i) the
% learning rate used in the standard BP algorithm, (ii) the
% momentum term used in the BP with momentum term and (iii) the
% three values involved in the BP with adaptive learning rate.
%
% OUTPUT ARGUMENTS:
% net: the neural network as a MATLAB object
% tr: training record (epoch and performance)
%
% (c) 2010 S. Theodoridis, A. Pikrakis, K. Koutroumbas, D. Cavouras
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
rand('seed',0); % Initialization of the random number generators
randn('seed',0);
% List of training methods
methods_list={'traingd'; 'traingdm'; 'traingda';'trainlm'; 'traingdx'};
% Per-dimension [min max] limits of the region where the data lie
% (minmax(X) returns an lx2 matrix, so any input dimension l is handled)
limit=minmax(X);
% Neural network definition: k tansig nodes in the hidden layer, one
% purelin output node, trained with the selected method
net=newff(limit,[k 1],{'tansig','purelin'},methods_list{code,1});
% Neural network initialization
net=init(net);
% Setting parameters
net.trainParam.epochs=iter;
net.trainParam.lr=par_vec(1);
if(code==2)
    net.trainParam.mc=par_vec(2);
elseif(code==3)
    net.trainParam.lr_inc=par_vec(3);
    net.trainParam.lr_dec=par_vec(4);
    net.trainParam.max_perf_inc=par_vec(5);
end
% Neural network training
[net,tr]=train(net,X,y);
% NOTE: During training, MATLAB shows a plot of the MSE versus the number
% of iterations (epochs).
..............................................................
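For reference, a call on an 18x99 data matrix could then look like the sketch below; the data, labels, hidden-layer size, and parameter values are illustrative assumptions, not values from the original post.
X = randn(18,99);                   % placeholder 18x99 data, one sample per column
y = [zeros(1,50) ones(1,49)];       % hypothetical class labels, one per column
k = 5;                              % hypothetical number of hidden nodes
code = 3;                           % BP with adaptive learning rate ('traingda')
iter = 300;                         % maximum number of epochs
par_vec = [0.01 0 1.05 0.7 1.04];   % lr, mc, lr_inc, lr_dec, max_perf_inc
[net,tr] = NN_training(X,y,k,code,iter,par_vec);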

Accepted Answer

Greg Heath on 18 May 2015
I don't see why you would need this subroutine in 2015.
1. NEWFF and its special cases NEWFIT (for regression/curve-fitting) and NEWPR (for classification/pattern-recognition) have been OBSOLETE since R2010b.
2. NNTOOL yields a comprehensive script which uses the currently supported replacement functions FEEDFORWARDNET, FITNET and PATTERNNET.
3. If you want, for some reason, to have a subroutine, use the script provided by NNTOOL.
Sorry for being so negative.
Greg
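For instance, a rough PATTERNNET sketch for data shaped like the 18x99 matrix in the question could look like this (the hidden-layer size, training function, targets, and epoch limit are illustrative assumptions):
x = randn(18,99);                 % placeholder 18x99 input matrix, one sample per column
t = [zeros(1,50) ones(1,49)];     % hypothetical 0/1 class targets
net = patternnet(10,'traingdx');  % 10 hidden nodes, gradient descent with momentum and adaptive lr
net.trainParam.epochs = 300;      % maximum number of epochs
[net,tr] = train(net,x,t);
y = net(x);
perf = perform(net,t,y)           % network performance on the training data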
  1 comment
Greg Heath on 18 May 2015
Furthermore, if you are a beginner, you will probably be better off using a slightly modified version of the simple script obtained when you use the help and doc commands. Example: for regression/curve-fitting both
help fitnet
doc fitnet
yield
[x,t] = simplefit_dataset;
net = fitnet; % default: same as fitnet(10)
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,t,y)
with the obvious deficiencies:
1. No dimensions or plots of x, t, and t vs x.
2. What does the value perf mean? How do I know it is low enough?
3. Every time it is run I get a different answer.
Actually, by looking at the automatically generated regression and fit plots you can get a good feel for whether the model is good or not. However, having an understandable numeric performance measure would be much better.
Since even the multiple-option script obtained from NNTOOL doesn't address these points, I will post another NEWSGROUP tutorial in a few hours.
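For example, one common way to address points 2 and 3 is to fix the random seed and report the MSE normalized by the target variance; the sketch below reuses the fitnet example and is only illustrative, not the tutorial mentioned above.
rng(0);                       % fix the random number generator so repeated runs match
[x,t] = simplefit_dataset;    % 1x94 example inputs and targets
net = fitnet(10);             % 10 hidden nodes (the default)
[net,tr] = train(net,x,t);
y = net(x);
perf = perform(net,t,y);      % mean squared error
NMSE = perf/var(t,1);         % MSE normalized by the target variance
Rsq = 1 - NMSE                % near 1: good fit; near 0: no better than predicting mean(t)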


More Answers (0)
