Error while transferring weights of a trained CNN network to an empty CNN network
Radians on 19 Feb 2020
Commented: Radians on 15 Jan 2021
Hi,
I am trying to transfer the weights of layer 11 from 'original_net' to layer 11 of 'layers_final'. Both have the same structure, and 'layers_final' is just the empty, untrained version of 'original_net'. I am using the following command:
layers_final(11).Weights = net_1.Layers(11).Weights
I get the following error while doing so:
Error using nnet.cnn.layer.TransposedConvolution2DLayer/set.Weights (line 204)
Expected input to be of size 4x4x8x1, but it is of size 4x4x8x8.
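For context, the mismatch can be seen by comparing the size of the trained weights with the input-channel setting of the untrained layer. A minimal diagnostic sketch, assuming the variable names net_1 (trained network) and layers_final (untrained layer array) used in this post:
% Diagnostic sketch: compare the trained weights with the channel setting of the untrained layer
size(net_1.Layers(11).Weights)   % 4x4x8x8 in the trained network
layers_final(11).NumChannels     % 1 here, which constrains the weights to 4x4x8x1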
Code for layers_final:
imageLayer_final = imageInputLayer([32,32,1]);
encodingLayers_final = [ ...
convolution2dLayer(3,16,'Padding','same'), ...
reluLayer, ...
maxPooling2dLayer(2,'Padding','same','Stride',2), ...
convolution2dLayer(3,8,'Padding','same'), ...
reluLayer, ...
maxPooling2dLayer(2,'Padding','same','Stride',2), ...
convolution2dLayer(3,8,'Padding','same'), ...
reluLayer, ...
maxPooling2dLayer(2,'Padding','same','Stride',2)];
decodingLayers_final = [ ...
createUpsampleTransponseConvLayer(2,8), ...
reluLayer, ...
createUpsampleTransponseConvLayer(2,8), ...
reluLayer, ...
createUpsampleTransponseConvLayer(2,16), ...
reluLayer, ...
convolution2dLayer(3,1,'Padding','same'), ...
clippedReluLayer(1.0), ...
regressionLayer];
layers_final = [imageLayer_final, encodingLayers_final, decodingLayers_final];
The trained network ('original_net') is attached to the question.
Thanks
0 comments
Accepted Answer
Srivardhan Gadila on 24 Feb 2020
If the function createUpsampleTransponseConvLayer is the helper function from the example Prepare Datastore for Image-to-Image Regression, then change the 'NumChannels' name-value argument to 'auto', or omit it from the transposedConv2dLayer call.
% helper function from the example Prepare Datastore for Image-to-Image Regression
function out = createUpsampleTransponseConvLayer(factor,numFilters)
filterSize = 2*factor - mod(factor,2);
cropping = (factor-mod(factor,2))/2;
numChannels = 1;
out = transposedConv2dLayer(filterSize,numFilters, ...
'NumChannels',numChannels,'Stride',factor,'Cropping',cropping);
end
Since the layer is defined with 'NumChannels' (the number of channels of the input to this transposedConv2dLayer) equal to 1, it can only accept weights of size filterSize-by-filterSize-by-numFilters-by-numChannels, which is 4x4x8x1 in this case.
Change the function as follows:
function out = createUpsampleTransponseConvLayer(factor,numFilters)
filterSize = 2*factor - mod(factor,2);
cropping = (factor-mod(factor,2))/2;
out = transposedConv2dLayer(filterSize,numFilters, ...
'Stride',factor,'Cropping',cropping);
end
and then define the layers.
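With 'NumChannels' left at its default ('auto'), the rebuilt layer no longer fixes the number of input channels to 1, so the assignment from the question should be accepted. A minimal sketch of the transfer, reusing the variable names net_1 and layers_final from the question (copying the bias as well is assumed to be wanted):
% Sketch of the weight transfer after rebuilding layers_final without 'NumChannels',1
layers_final(11).Weights = net_1.Layers(11).Weights;   % 4x4x8x8 is now accepted
layers_final(11).Bias    = net_1.Layers(11).Bias;      % transfer the bias as well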
3 comments
Srivardhan Gadila on 9 Mar 2020
I have heard that this issue is known and the concerned parties might be working on it.
More Answers (0)