Set a specific weight for a connection in neural networks

63 views (last 30 days)
Abdelwahab Afifi on 9 Jan 2020
Edited: Matthew Heberger on 2 May 2022
I have already built my neural network. I want to set a specific weight for the connection from layer i to layer j; for example, set the weights from the inputs to the outputs to 1.
How can I do this in MATLAB?
X = Calculations('comp2real', PA_in);  % convert nx1 complex vector to nx2 real vector
T = Calculations('comp2real', PA_out);
%% Network structure
net = feedforwardnet(20);
% net.numInputs = 2
% net.layers{2}.size = 2;
net.biasConnect = [1; 0];
% net.inputWeights{2,1}.weight = 1;
net.inputWeights{2,1}.learn = 0;
net.layers{1}.transferFcn = 'poslin';
% net.inputConnect = [1 1; 1 1]
net.inputConnect = [1; 1];
[net, tr] = train(net, X, T);
plotperform(tr)
view(net)
wb = getwb(net);
Param_num = length(wb)
%% Evaluation
Y = net(X);
perf = perform(net, T, Y)
  1 comment
Adam Danz on 9 Jan 2020
Have you tried to search for the answer to this question? Google returns some useful examples and starting points for solving this. Searching the MATLAB documentation directly is also helpful. Without more context (how you created the neural network, etc.), we're shooting in the dark. I'd be interested in hearing what solutions you've found and where we can help implement them.


Answers (2)

Srivardhan Gadila on 24 Jan 2020
Please refer to the following documentation pages: Weight and Bias Values, Input Weights, and Layer Weights.
If by "I have already build my neural network" you imply that
1. The network architecture is defined and has to be trained:
Then you can access the layer weights as follows:
net.LW{i,j}
You can set any values to the above weights and set the net.layerWeights{i,j}.learn to 0 so that the weights won't be altered during the training & adaption. In this case setting a specific weight for a connection is not possible since the property net.layerWeights{i,j}.learn is defined for the entire connections between layers i and j.
net.layerWeights{i,j}.learn = 0
net.LW{i,j} = ones(size(net.LW{i,j})) % any weights of size(net.LW{i,j})
2. The network architecture is defined and already trained:
Then you can set the weight of the connection between node k of layer i and node l of layer j as follows:
net.LW{i,j}(k,l) = 1
and then use the network.
The same can be done with the input weights (net.IW) too.
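As a minimal, self-contained sketch of the first case (the network size and data below are made up for illustration), this freezes the layer weights between layers 1 and 2 of a one-hidden-layer feedforwardnet and fixes them to 1 before training:
% Minimal sketch: freeze the layer weights between layer 1 and layer 2
% before training (data and sizes are hypothetical).
X = rand(3, 100);                        % 3 inputs, 100 samples
T = rand(1, 100);                        % 1 target per sample
net = feedforwardnet(5);                 % one hidden layer with 5 neurons
net = configure(net, X, T);              % initialize sizes so net.LW{2,1} exists
net.layerWeights{2,1}.learn = 0;         % keep these weights fixed during training
net.LW{2,1} = ones(size(net.LW{2,1}));   % set all layer 1 -> layer 2 weights to 1
[net, tr] = train(net, X, T);            % training updates the remaining parameters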

Matthew Heberger on 2 May 2022
Edited: Matthew Heberger on 2 May 2022
I found that it was difficult to set fixed input weights for a custom feedforward network (in MATLAB R2022a). I wanted to set the weight from input 10 to layer 25 to -1, and for layer 25 to have a bias of 0, and used the following code:
net.biases{25}.learn = false;
net.b{25} = 0;
net.inputWeights{25, 10}.learn = false;
net.IW{25, 10} = -1; % This line caused an error
This gave the following error message at runtime:
Error using network/subsasgn>network_subsasgn
net.IW{25,10} must be a 1-by-0 matrix.
A colleague and I discovered that if we configure the network first, we can then set the input weights:
net = configure(net, X, T); %configure the network first
net.biases{25}.learn = false;
net.b{25} = 0;
net.inputWeights{25, 10}.learn = false;
net.IW{25, 10} = -1; % Runs OK *after configuration*
It took us a long time to figure this out, and it feels like a MATLAB bug, so I'm posting it here in the hope that it helps someone else.
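As a self-contained illustration of the same configure-first pattern, here is a hedged sketch using a plain feedforwardnet and made-up data in place of the custom network above:
% Hypothetical minimal example of the configure-first workaround.
X = rand(4, 50);                        % 4 inputs, 50 samples (made-up data)
T = rand(1, 50);
net = feedforwardnet(3);
% net.IW{1,1} is empty until the network is configured, so assigning to it fails.
net = configure(net, X, T);             % now net.IW{1,1} is 3-by-4
net.inputWeights{1,1}.learn = false;    % freeze all input weights
net.IW{1,1}(1, 4) = -1;                 % weight from input 4 to neuron 1 of layer 1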
