Slow checkLayer function for simple custom reshape layer

2 views (last 30 days)
Julius Å on 22 Jul 2019
Commented: faris azhari on 6 Nov 2019
I'm trying to create a simple layer to use directly after my imageInputLayer in a neural network. The purpose of the layer is to channel-split the image input. The input is a three-channel image, where the first channel contains grayscale image data, the middle channel is a zero matrix, and the third channel contains a few parameters that are later extracted into a 1x11x1 vector by a max-pooling layer. The point of this is to be able to use both image and parameter inputs in my neural network. The code for the custom layer is as follows:
classdef splitLayer < nnet.layer.Layer
    properties
        inputSize;
        outputSize;
    end
    methods
        function layer = splitLayer(Name)
            % Initialize output and input sizes
            layer.Name = Name;
            layer.inputSize = [256 256 3];
            layer.outputSize = [256 256 1];
            layer.NumOutputs = 2;
        end
        function [Z1, Z2] = predict(layer, X)
            numObservations = size(X,4);
            % Split channels and return as separate outputs
            Z1 = zeros([layer.outputSize numObservations], 'like', X);
            Z2 = zeros([layer.outputSize numObservations], 'like', X);
            for i = 1:numel(numObservations)
                Z1(:, :, :, i) = X(:, :, 1, i);
                Z2(:, :, :, i) = X(:, :, 3, i);
            end
        end
        function [Z1, Z2, memory] = forward(layer, X)
            numObservations = size(X,4);
            % Split channels and return as separate outputs
            Z1 = zeros([layer.outputSize numObservations], 'like', X);
            Z2 = zeros([layer.outputSize numObservations], 'like', X);
            for i = 1:numel(numObservations)
                Z1(:, :, :, i) = X(:, :, 1, i);
                Z2(:, :, :, i) = X(:, :, 3, i);
            end
            memory = numObservations;
        end
        function [dLdX] = backward(layer, ~, ~, ~, dLdZ1, dLdZ2, memory)
            numObservations = size(dLdZ1, 4);
            is = layer.inputSize;
            dLdX = zeros(is(1), is(2), 3, numObservations, 'like', dLdZ1);
            zeromat = zeros(is(1), is(2), 'like', dLdZ1);
            % For each observation, concatenate the separated channels and pass backwards
            for i = 1:numel(numObservations)
                dLdX(:,:,:,i) = cat(3, dLdZ1(:, :, :, i), zeromat, dLdZ2(:, :, :, i));
            end
        end
    end
end
However, when I use checkLayer to validate the custom layer, it runs very slowly, and some debugging has shown that the time is spent computing the derivatives in the backward function.
Since this is only my second custom layer, I'm not sure where the problem lies. Could it be the use of the cat function? If so, how do I implement the same functionality in a way that runs faster on a GPU?
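(For reference, one way to avoid the per-observation loops entirely is to index the channel dimension directly, which operates on all observations at once. This is only a sketch of the predict/backward bodies under the same [256 256 3] input assumption as above, not benchmarked on a GPU:)

```matlab
function [Z1, Z2] = predict(layer, X)
    % Indexing the channel dimension directly keeps the
    % observation dimension intact; no loop or preallocation needed.
    Z1 = X(:, :, 1, :);   % grayscale channel
    Z2 = X(:, :, 3, :);   % parameter channel
end

function dLdX = backward(layer, ~, ~, ~, dLdZ1, dLdZ2, ~)
    % Rebuild the three-channel gradient in a single cat call,
    % inserting a zero middle channel for all observations at once.
    zeromat = zeros(size(dLdZ1), 'like', dLdZ1);
    dLdX = cat(3, dLdZ1, zeromat, dLdZ2);
end
```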
2 comments
Valentin Steininger on 9 Aug 2019
Hi
I also tried to create such a splitData layer, with 4 outputs. When I run my code I get the error: "Incorrect number of output arguments for 'predict' in Layer splitDataLayer. Expected to have 1, but instead it has 4."
Then I ran your code to see what happens, and it throws the same error even though the NumOutputs property has been set properly. So this might be a release issue.
May I ask which release you are using for that layer?
faris azhari on 6 Nov 2019
Hi,
Unrelated to the main question, why did you do:
for i = 1:numel(numObservations)
...
end
Wouldn't that make the loop run only once? Since:
numObservations = size(X,4);
always returns a scalar, numel(numObservations) is always 1.
Should it be:
for i = 1:numObservations
...
end
instead? Or is that how it's supposed to be when defining a custom layer?


Answers (1)

Divya Gaddipati on 5 Aug 2019
With the checkLayer function, the gradient checks take a long time to run for large input sizes. To speed up the tests, specify a smaller valid input size (for example, [24 24 3] or [5 5 3]).
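(For example, the checks could be run with a small input like this sketch. Note that the splitLayer in the question hard-codes inputSize and outputSize to 256x256, so those properties would need to become constructor arguments, or be shrunk, before a smaller input is valid:)

```matlab
layer = splitLayer('split');
% Run the layer checks on a small input instead of the full
% 256x256x3 image; dimension 4 is the observation dimension.
checkLayer(layer, [24 24 3], 'ObservationDimension', 4);
```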
