How do I create a neural network that will give multiple outputs?
Muhammad Fakir
on 1 Nov 2016
I have data in the form 141x18, i.e., 141 cases with 18 parameters each. I want to create a feedforward network that produces a weight matrix of 18x18, namely the weights from the second hidden layer to the output layer. I am only able to produce an output layer of 141x1. I can build a network with structure 18-36-36-1, but is it possible to create a network of 18-36-36-18? If I change net.outputConnect(2) = 18; I get the error: >Error using network/train (line 340) >Number of targets does not match net.numOutputs.
Please help. My script is :
clear;
net= feedforwardnet([36 36]);
net.inputs{1}.size = 18; %(Number of neurons in the input layer)
%net
numLayers = 2; %(Total number of layers)
net.layers{1}.size = 36; %(Number of neurons in first hidden layer)
net.layers{2}.size = 36; %(Number of neurons in second hidden layer)
net.inputConnect(1) = 1;
net.layerConnect(2,1) = 18;
net.outputConnect(2) = 18;
net.targetConnect(2) = 1;
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
%net.biasConnect = [1;1];
net = init(net);
net.initFcn = 'initlay';
net.layers{1}.initFcn = 'initnw';
net.layers{2}.initFcn = 'initnw';
net.inputWeights{1,1}.initFcn = 'rands';
net.inputWeights{2,1}.initFcn = 'rands';
net.biases{1}.initFcn = 'rands';
net.biases{2}.initFcn = 'rands';
net.performFcn = 'mse';
net.trainFcn = 'trainlm';
net.trainParam.lr = 0.02;
net.trainParam.goal = 0.00010;
net.trainParam.mc = 0.09;
net.trainParam.epochs = 10000;
net.trainParam.show = 100;
numNN = 10;
nets = cell(1,numNN);
%net.input.processFcns = {'removeconstantrows','mapminmax'};
%net.output.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
load data.mat
A = transpose(A1);
B = transpose(B1);
[An,minA,maxA] = premnmx(A)
[Bn,minB,maxB] = premnmx(B)
net = init(net);
[net,tr] = train(net,An,Bn);
%gensim (net);
a1 = sim(net,An);
a = postmnmx(a1,minB,maxB);
t = B - a;
perf = mse(t)
net.performFcn = 'mse';
x=[1:1:141]
plot(x,a,'r+:',x,B,'gd:')
% Test the Network
y = net(A);
e = gsubtract(B,y);
performance = perform(net,B,y)
% Recalculate Training, Validation and Test Performance
trainTargets = B .* tr.trainMask{1};
valTargets = B .* tr.valMask{1};
testTargets = B .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
IW = net.IW{1,1};
LW = net.LW{2,1};
bias = net.b{1};
0 comments
Accepted Answer
Greg Heath
on 4 Nov 2016
x = randn(18,141);
t = randn(18,141);
net = feedforwardnet([ 36 36 ]);
net = train(net,x,t);
view(net)
Thank you for formally accepting my answer
Greg
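[Editor's note, not part of the original reply: for readers who, like the asker, want the weight matrices themselves, they can be read directly off the trained network object. A minimal sketch building on Greg's example; variable names are illustrative, and the IW/LW/b cell-array indexing follows the standard Deep Learning Toolbox network-object layout.]
```matlab
x = randn(18,141);                 % 18 inputs, 141 cases
t = randn(18,141);                 % 18 targets per case
net = feedforwardnet([36 36]);     % 18-36-36-18 once configured by train
net = train(net,x,t);
IW  = net.IW{1,1};                 % 36x18, input -> first hidden layer
LW1 = net.LW{2,1};                 % 36x36, first hidden -> second hidden
LW2 = net.LW{3,2};                 % 18x36, second hidden -> output layer
b3  = net.b{3};                    % 18x1, output-layer biases
```
Note the output-layer weight matrix is 18x36 (18 outputs by 36 second-hidden-layer neurons), not 18x18; its row count is set by the target dimension, its column count by the preceding layer's size.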
1 comment
Pkm
on 21 Dec 2017
Edited: Pkm on 21 Dec 2017
How did you solve this? net.outputConnect is a 1xN matrix, where N is the number of layers, whereas your output is 18? I'm facing the same problem too! @Greg I tried it the same way you have shown, but it doesn't work because my data is a 1x600 cell array of 960x1 matrices, i.e., 600 timesteps of 960 elements each.
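[Editor's note: for data shaped like Pkm describes, a 1x600 cell array of 960x1 vectors, one option is to treat the cases as static samples by concatenating the cells into a single 960x600 matrix before calling train. A hedged sketch with stand-in random data; this discards the timestep ordering, which only matters for dynamic (time-series) networks.]
```matlab
xc = num2cell(randn(960,600), 1);  % stand-in for the 1x600 cell array of 960x1 vectors
X  = cell2mat(xc);                 % 960x600 matrix, one column per sample
tc = num2cell(randn(960,600), 1);  % stand-in targets in the same cell format
T  = cell2mat(tc);                 % 960x600 target matrix
net = feedforwardnet([36 36]);
net = train(net, X, T);            % 960 inputs -> 960 outputs, 600 samples
```
If the sequential ordering does matter, train can instead be given the cell arrays directly, which the toolbox interprets as a 600-timestep sequence.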
More Answers (0)