dlgradient: Error "Value to differentiate must be a traced dlarray scalar."

I want to train a deep network using automatic differentiation, but I get the error above. Is there any solution?
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
a_pre = forward(dlnet,a);
% Output (target values)
b = rand(1,10);
loss = mse(a_pre,b);
gradients = dlgradient(loss,dlnet.Learnables);

Accepted Answer

Anshika Chaurasia on 18 Jan 2021
Hi Qi Lu,
dlgradient must be called inside a function that is evaluated with dlfeval, so that the loss it receives is a traced dlarray. You can try the following code to compute the gradients, which will resolve your error:
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
% Evaluate the loss and gradients inside dlfeval so that dlgradient
% receives a traced dlarray
[loss,gradients] = dlfeval(@compute_gradient,dlnet,a);

function [loss,gradients] = compute_gradient(dlnet,a)
    a_pre = forward(dlnet,a);
    % Random target output, as in the question
    b = rand(1,10);
    loss = mse(a_pre,b);
    % Automatic differentiation: loss is already a traced dlarray here
    gradients = dlgradient(loss,dlnet.Learnables);
end
Refer to the MATLAB documentation on automatic differentiation for more information.
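
For completeness, here is a minimal sketch of how the same dlfeval pattern extends to a custom training loop; this is an illustration, not part of the original answer. The optimizer (adamupdate), the hyperparameters numEpochs and learnRate, and the random placeholder targets b are assumptions chosen for the example.
numEpochs = 10;    % assumed number of epochs, for illustration only
learnRate = 1e-3;  % assumed learning rate, for illustration only
averageGrad = [];
averageSqGrad = [];
a = dlarray(rand(9,36,1,10),'SSCB');
b = rand(1,10);    % placeholder targets, random as in the question
for epoch = 1:numEpochs
    % Evaluate loss and gradients inside dlfeval so dlgradient sees a traced dlarray
    [loss,gradients] = dlfeval(@modelGradients,dlnet,a,b);
    % Update the learnable parameters with the Adam optimizer
    [dlnet,averageGrad,averageSqGrad] = adamupdate(dlnet,gradients, ...
        averageGrad,averageSqGrad,epoch,learnRate);
end

function [loss,gradients] = modelGradients(dlnet,a,b)
    a_pre = forward(dlnet,a);
    loss = mse(a_pre,b);
    gradients = dlgradient(loss,dlnet.Learnables);
end
In practice you would replace a and b with mini-batches of your own training data instead of random values.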

More Answers (0)

Release

R2020b
