
Create custom NARX net

6 views (last 30 days)
Stefan on 16 Jun 2017
Answered: Greg Heath on 20 Jun 2017
Hi,
I'm struggling to create a series-parallel architecture net (Pic1). I want to use this architecture to train my net.
Could somebody tell me how I can connect the output to the first layer? Afterwards I'd like to use this net:
CODE (Pic2)
BattCurrent = Experiment.Results(1).BattCurrent__A_;
CellVolt = Experiment.Results(1).CellVolt__V_;
SOC = Experiment.Results(1).SOC__0_1_;
CellTemperature = Experiment.Results(1).CellTemperature__K_;
NumberOfChargeProcedures = Experiment.Results(1).NumberOfChargeProcedures____;
AgeingCapacity = Experiment.Results(1).AgeingCapacity;
% Input vector X
X = [BattCurrent CellVolt SOC CellTemperature NumberOfChargeProcedures]';
%X = con2seq(X);
% Output vector T
T = [AgeingCapacity]';
%T = con2seq(T);
[Xn,Xs] = mapminmax(X);
[Tn,Ts] = mapminmax(T);
% ANN
net = network;
net.name = 'Test';
net.numInputs = 1;
net.numLayers = 3;
net.biasConnect = [1; 1; 1];
net.inputConnect(1,1) = 1;   % external input -> hidden layer 1
net.layerConnect(2,1) = 1;   % hidden layer 1 -> hidden layer 2
net.layerConnect(3,2) = 1;   % hidden layer 2 -> output layer
net.layerConnect(1,3) = 1;   % output layer fed back to hidden layer 1
net.outputConnect(1,3) = 1;  % layer 3 provides the network output
%Layers
net.layers{1}.size = 15;
net.layers{1}.transferFcn = 'tansig';
net.layers{1}.initFcn = 'initnw';
net.layers{1}.name = 'Hidden Layer 1';
net.layers{2}.size = 15;
net.layers{2}.transferFcn = 'tansig';
net.layers{2}.initFcn = 'initnw';
net.layers{2}.name = 'Hidden Layer 2';
net.layers{3}.size = 1;
net.layers{3}.transferFcn = 'purelin';
net.layers{3}.initFcn = 'initnw';
net.layers{3}.name = 'Output';
% NARX feedback
net.layerWeights{1,3}.delays = [1];  % one-step delay on the output feedback connection
%Functions
net.initFcn = 'initlay';
net.performFcn = 'mse';
net.trainFcn = 'trainbr';
net.divideFcn = 'dividerand';
%Plots
net.plotFcns = {'plotperform','plottrainstate'};
view(net)
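A minimal sketch of how I would then train and simulate this net (assuming the commented-out con2seq conversion above, so that train can handle the one-step feedback delay):
Xseq = con2seq(Xn);                 % matrices -> cell-array time sequences
Tseq = con2seq(Tn);
[net,tr] = train(net,Xseq,Tseq);    % initial delay states default to zero
Yseq = net(Xseq);                   % simulate the trained net
perf = perform(net,Tseq,Yseq)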
Thank you in advance! Best, Stefan

Answers (2)

Jayaram Theegala on 19 Jun 2017
You can use the "closeloop" function to connect the output to the first layer, in other words, to convert a neural network into a closed-loop network. For more information about this function, click on the following URL:
After creating the above closed-loop network, you can create a feedforward network using the "feedforwardnet" function; to find more information about this function, click on the following MATLAB documentation page:
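A minimal sketch of that workflow (assuming the standard narxnet, preparets and closeloop toolbox functions; Xseq and Tseq stand for the con2seq-converted inputs and targets from the question):
net = narxnet(1:2,1:2,10);                   % open-loop (series-parallel) NARX net
[Xs,Xi,Ai,Ts] = preparets(net,Xseq,{},Tseq); % shift data for the tapped delay lines
net = train(net,Xs,Ts,Xi,Ai);
netc = closeloop(net);                       % feed the output back as an input
view(netc)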

Greg Heath on 20 Jun 2017
See the documentation examples:
help narxnet
doc narxnet
The only significant difference between your design and the documentation examples is that you have 2 hidden layers.
However:
1. Use DIVIDEBLOCK for training.
Hope this helps.
Greg
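For illustration, a sketch along those lines (two hidden layers as in the question, DIVIDEBLOCK for the data split; the delays and layer sizes are placeholders, and Xseq/Tseq are the con2seq-converted data):
net = narxnet(1:2,1:2,[15 15]);     % hidden layer sizes given as a vector
net.divideFcn = 'divideblock';      % contiguous blocks preserve the time order
[Xs,Xi,Ai,Ts] = preparets(net,Xseq,{},Tseq);
[net,tr] = train(net,Xs,Ts,Xi,Ai);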
