Custom Weighted Classification Layer: Change in input value size
Raza Ali on 10 Oct 2019
Commented: Raza Ali on 22 Oct 2019
I am trying to change the input value size from [1 1 x] to [50 50 x] (where x = 1, 2, 3, and so on) in weightedClassificationLayer, but it gives an error. I need to know where I can make changes so that this function accepts the different input sizes.
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.
            N = size(Y,4);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            loss = -sum(W*(T.*log(Y)))/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.
            [H,Wi,K,N] = size(Y);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            dLdY = -(W'.*T./Y)/N;
            dLdY = reshape(dLdY,[H Wi K N]);
        end
    end
end
To check the validity of this layer:
classWeights = [0 1]
Size=size(classWeights)
layer = weightedClassificationLayer(classWeights);
numClasses = numel(classWeights)
validInputSize = [1 1 numClasses]
checkLayer(layer,validInputSize, 'ObservationDimension',4);
This works with validInputSize = [1 1 numClasses], but I am trying to change it to [x x numClasses] (x being any number greater than 1).
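For reference, the check I am aiming for would look something like the following sketch (the spatial size 5 is just an illustrative value; this call errors with the layer as written above):

classWeights = [0 1];
layer = weightedClassificationLayer(classWeights);
numClasses = numel(classWeights);

% Spatial dimensions greater than 1 -- this is the case that currently fails.
validInputSize = [5 5 numClasses];
checkLayer(layer, validInputSize, 'ObservationDimension', 4);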
0 comments
Accepted Answer
Divya Gaddipati on 22 Oct 2019
As I understand it, you want to change the validInputSize to [x x numClasses], which implies that a single prediction will be of size [x x 1]. Hence, you also need to reshape your classWeights (which is W in your code) to [x x numClasses].
I have modified your code below:
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.
            N = size(Y,4);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            %% Modified %%
            % Expand the 1-by-2 class weight vector so that it has the
            % same size as T (this assumes two classes, as in the example
            % classWeights = [0 1]).
            n = length(T(:))/2;
            W = repelem(W, 1, [n n]);
            W = reshape(W, size(T));

            weightedCE = W.*(T.*log(Y));
            loss = -sum(weightedCE(:))/N;
            %% Modified %%
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.
            [H,Wi,K,N] = size(Y);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            %% Modified %%
            % Same weight expansion as in forwardLoss (two classes assumed).
            n = length(T(:))/2;
            W = repelem(W, 1, [n n]);
            W = reshape(W, size(T));
            %% Modified %%

            dLdY = -(W.*T./Y)/N;
            dLdY = reshape(dLdY,[H Wi K N]);
        end
    end
end
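To see what the modified weight-expansion step does, here is a minimal sketch (the sizes are chosen purely for illustration) for the two-class case with a 3-by-3 spatial size and a single observation:

W = [0 1];                       % classWeights from the question
T = zeros(3, 3, 2);              % squeezed targets, size [H W numClasses]
n = length(T(:))/2;              % 9 elements per class channel
Wexp = repelem(W, 1, [n n]);     % 1-by-18: first 9 entries are 0, last 9 are 1
Wexp = reshape(Wexp, size(T));   % column-major fill: channel 1 all 0, channel 2 all 1
Wexp(:,:,1)                      % weight for class 1 at every pixel
Wexp(:,:,2)                      % weight for class 2 at every pixel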
5 comments
Divya Gaddipati on 22 Oct 2019
Generally, such large sizes (like 256 or 512) are not recommended for checkLayer. To speed up the tests, specify a smaller valid input size.
You can find more information here: https://www.mathworks.com/help/deeplearning/ref/checklayer.html#d117e116206
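For example, a check with a small spatial size (4 here is just an illustrative value) would look like:

classWeights = [0 1];
layer = weightedClassificationLayer(classWeights);
validInputSize = [4 4 numel(classWeights)];   % small spatial size keeps the gradient checks fast
checkLayer(layer, validInputSize, 'ObservationDimension', 4);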
More Answers (0)