Neural Networks - Feedforwardnet Configuration

5 views (last 30 days)
Sherry X on 12 Dec 2019
Edited: Sherry X on 12 Dec 2019
I'm trying to use growing-batch training to fit my data with a simple feedforwardnet. I have a growing dataset, which means one new sample is generated after each learning period, as shown below.
At time k, the training set consists of the samples collected so far, say (x1, y1), ..., (xk, yk); before training, I normalize the inputs and targets to [0, 1] and train the net on this set.
At time k+1, a new sample (x(k+1), y(k+1)) is generated and appended, and the net is trained again on the enlarged set. A rough sketch of the loop is below.
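Roughly, the loop I have in mind looks like this (getNewSample and K are just placeholders for however the samples are produced, not toolbox functions):
X = []; % inputs collected so far, one column per sample
Y = []; % targets collected so far
net = feedforwardnet(32);
for k = 1:K
    [xk, yk] = getNewSample(k); % placeholder: the new sample generated at time k
    X = [X, xk];
    Y = [Y, yk];
    Xn = mapminmax(X, 0, 1); % rescale each input row to [0, 1]
    Yn = mapminmax(Y, 0, 1); % rescale each target row to [0, 1]
    [net, tr] = train(net, Xn, Yn); % retrain on everything collected so far
end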
If I use the default feedforwardnet settings
net = feedforwardnet(32);
net.divideParam.trainRatio = 0.7; % training set [%]
net.divideParam.valRatio = 0.15; % validation set [%]
net.divideParam.testRatio = 0.15; % test set [%]
net.trainParam.epochs = 300; % maximum number of training epochs
net = init(net); % reinitialize weights and biases
I can get reasonable outputs from the trained net.
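(Concretely, the calls I'm using are just the standard ones; Xn and Yn stand for my normalized input and target matrices.)
[net, tr] = train(net, Xn, Yn); % train with the settings above
yhat = net(Xn); % query the trained net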
But since my inputs and targets are normalized to [0, 1], I tried to change the activation functions to
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'poslin';
and then the net gives the same output for different inputs (or just 0). The same thing happens when I try to create a net with more layers.
Could anyone let me know how to configure the net correctly if I change the transferFcn?
Also, if I normalize the inputs and targets myself before fitting the net, do I need to disable the processFcns as well, like this?
net.inputs{1}.processFcns = {}; % drop the default removeconstantrows/mapminmax on the input
net.outputs{2}.processFcns = {}; % output processFcns sit on the last layer's output (index 2 here)
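Putting it all together, the configuration I'm experimenting with looks roughly like this (Xn and Yn are my pre-normalized data; the configure/init calls after changing the transfer functions, and using outputs{2} for the output processFcns, are just my guess at what is needed):
net = feedforwardnet(32);
net.divideParam.trainRatio = 0.7; % training set [%]
net.divideParam.valRatio = 0.15; % validation set [%]
net.divideParam.testRatio = 0.15; % test set [%]
net.trainParam.epochs = 300;
% change the activation functions
net.layers{1}.transferFcn = 'logsig'; % hidden layer, activations in (0,1)
net.layers{2}.transferFcn = 'poslin'; % output layer, ReLU
% drop the default preprocessing since Xn and Yn are already in [0, 1]
net.inputs{1}.processFcns = {};
net.outputs{2}.processFcns = {};
% resize the net for the data and reinitialize the weights
net = configure(net, Xn, Yn);
net = init(net);
[net, tr] = train(net, Xn, Yn);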
Many thanks!

Answers (0)
