How to change input values for the weighted classification layer?
I am using the weighted classification function given as an example in the MATLAB documentation.
But when I use it in my network it gives the error "Error using 'backwardLoss' in Layer weightedClassificationLayer. The function threw an error and could not be executed". I think the error is due to the input values, but I am not sure where to change them. The weighted classification function works well with the input values assigned in the example.
The link to the example: https://in.mathworks.com/help/deeplearning/ug/create-custom-weighted-cross-entropy-classification-layer.html
Here is the code I am using for the weighted classification layer:
%%%%%%
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.

            N = size(Y,4);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            loss = -sum(W*(T.*log(Y)))/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.

            [~,~,K,N] = size(Y);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            dLdY = -(W'.*T./Y)/N;
            dLdY = reshape(dLdY,[1 1 K N]);
        end
    end
end
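Before wiring the layer into a network, it can help to validate it in isolation with checkLayer from Deep Learning Toolbox, which runs forward/backward consistency tests and surfaces size mismatches like the one behind the backwardLoss error. A minimal sketch, assuming a 3-class problem (the weights and sizes are placeholders, adjust them to your data):

```matlab
% Sketch: validate the custom layer before training.
% Assumes 3 classes; ClassWeights must have one entry per class.
classWeights = [0.7 1.0 1.3];          % row vector, one weight per class
layer = weightedClassificationLayer(classWeights, 'weighted');

% For 1-by-1-by-K-by-N classification input, the observation dimension is 4.
validInputSize = [1 1 3];
checkLayer(layer, validInputSize, 'ObservationDimension', 4);
```

If checkLayer passes but training still errors, the most likely mismatch is between numel(classWeights) and the number of classes in your training labels.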
Accepted Answer
More Answers (2)
Pujitha Narra
on 10 Oct 2019
0 votes
Hi Raza Ali,
Can you mention how you are using 'weightedClassificationLayer' in your network? Assuming you want to know the inputs to the constructor of this class, 'classWeights' and the layer's 'name' are the only inputs:
'classWeights' - a row vector of weights corresponding to the classes in the order that they appear in the training data.
'name' - additionally specifies the layer name.
Also, this example might be of help.
Hope this helps!
8 comments
Raza Ali
on 10 Oct 2019
Pujitha Narra
on 10 Oct 2019
In the last line, when calling the constructor 'weightedClassificationLayer()', 'classWeights' is missing from the inputs. 'classWeights' is what customizes the classification layer to your requirements.
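In other words, the constructor must receive the weights. A sketch of a corrected layer array, assuming a hypothetical 3-class image network (the layer sizes are placeholders):

```matlab
% Sketch: pass classWeights to the constructor in the layer array.
classWeights = [0.4 0.3 0.3];   % must have one entry per class
layers = [
    imageInputLayer([28 28 1])
    fullyConnectedLayer(3)
    softmaxLayer
    weightedClassificationLayer(classWeights, 'weighted')];  % not weightedClassificationLayer()
```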
Raza Ali
on 10 Oct 2019
Pujitha Narra
on 11 Oct 2019
Hi Raza,
What do you mean when you say you want to take classWeights from the softmax layer?
Raza Ali
on 11 Oct 2019
evelyn
on 29 Apr 2024
'ClassWeights': classWeights is a row vector of weights corresponding to the classes in the order that they appear in the training data.
What if the training data is shuffled? How should the weights be ordered then?
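Shuffling the rows of the training data does not change the class order: for a categorical label array, the order is fixed by categories(labels), regardless of row order. A sketch for deriving inverse-frequency weights, assuming `YTrain` is your categorical label vector:

```matlab
% Sketch: derive class weights from (possibly shuffled) categorical labels.
% The weight order follows categories(YTrain), not the row order of the data.
counts = countcats(YTrain);                % per-class counts, in category order
classWeights = numel(YTrain) ./ (numel(counts) * counts');  % inverse frequency, row vector
disp(categories(YTrain));                  % this is the order the weights map to
```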
Ashwin
on 13 Jul 2022
0 votes
Try using classWeights' (the transpose) instead of classWeights, and check if that works.
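For context on why the transpose can matter: the forwardLoss line W*(T.*log(Y)) is a matrix product that only works when W is a 1-by-K row vector (giving a 1-by-N result); if ClassWeights was stored as a K-by-1 column, the inner dimensions do not match and the layer errors. A minimal sketch with made-up sizes:

```matlab
% Sketch: why the orientation of ClassWeights matters in forwardLoss.
K = 3; N = 5;
I = eye(K); T = I(:, randi(K, 1, N));      % one-hot targets, K-by-N
Y = rand(K, N); Y = Y ./ sum(Y, 1);        % fake softmax predictions, columns sum to 1
W = [0.7 1.0 1.3];                         % 1-by-K row vector
loss = -sum(W * (T .* log(Y))) / N;        % (1-by-K)*(K-by-N) -> 1-by-N, OK
% With a K-by-1 column, W * (T .* log(Y)) is a dimension mismatch; use W' there.
```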