Error on Multiple feature Input layers

7 views (last 30 days)
Chih
Chih on 2 Feb 2023
Answered: Vinayak Choyyan on 15 Feb 2023
I created a simple network with 4 feature inputs. The datastore and network are set up in the program below.
However, when my program calls "trainNetwork", I get this error message:
"Input datastore returned more than one observation per row for network input 1."
I know a few similar questions have been asked in the past, but none of them points out exactly where the problem is in the code. Can someone help me take a look and show me where the problem is?
Thanks.
%%
NumVars = 3;
NumInputs = 4;
TotRows = 2;
%% while hasdata(ds_1), read(ds_1), end
ds_1 = arrayDatastore(zeros(TotRows, NumVars));
ds_2 = arrayDatastore(zeros(TotRows, NumVars - 1));
ds_3 = arrayDatastore(zeros(TotRows, NumVars));
ds_4 = arrayDatastore(zeros(TotRows, NumVars - 1));
TargetY = zeros(TotRows, 1);
ds_TargetY = arrayDatastore(TargetY);
dsTrain = combine(ds_1, ds_2, ds_3, ds_4, ds_TargetY);
%%
lgraph = layerGraph();
for i = 1 : NumInputs
    if i == NumInputs
        UsedVars = NumVars - 1;
    else
        UsedVars = NumVars;
    end
    NameStrIn = ['In_' num2str(i)];
    inLayer = featureInputLayer(UsedVars, 'Name', NameStrIn);
    lgraph = addLayers(lgraph, inLayer);
    NameStrFC = ['FC_' num2str(i)];
    fcLayer = fullyConnectedLayer(UsedVars, 'Name', NameStrFC);
    lgraph = addLayers(lgraph, fcLayer);
    lgraph = connectLayers(lgraph, NameStrIn, NameStrFC);
end
concatLayer = concatenationLayer(1, NumInputs, 'Name', 'Concat');
lgraph = addLayers(lgraph, concatLayer);
for CurIdx = 1 : NumInputs
    lgraph = connectLayers(lgraph, ['FC_' num2str(CurIdx)], ['Concat/in' num2str(CurIdx)]);
end
Final_Layers = [
    fullyConnectedLayer(2, 'Name', 'Final_Layers')
    fullyConnectedLayer(1)
    regressionLayer];
lgraph = addLayers(lgraph, Final_Layers);
lgraph = connectLayers(lgraph, 'Concat', 'Final_Layers');
analyzeNetwork(lgraph)
%%
options = trainingOptions("sgdm", ...
    MaxEpochs=15, ...
    InitialLearnRate=0.01, ...
    Plots="training-progress", ...
    Verbose=0);
net = trainNetwork(dsTrain, lgraph, options);
  1 comment
Chih
Chih on 2 Feb 2023
Slightly updated program:
NumVars = 3;
NumInputs = 4;
TotRows = 2;
%% while hasdata(ds_1), read(ds_1), end
ds_1 = arrayDatastore(zeros(TotRows, NumVars));
ds_2 = arrayDatastore(zeros(TotRows, NumVars));
ds_3 = arrayDatastore(zeros(TotRows, NumVars));
ds_4 = arrayDatastore(zeros(TotRows, NumVars - 1));
TargetY = zeros(TotRows, 1);
ds_TargetY = arrayDatastore(TargetY);
dsTrain = combine(ds_1, ds_2, ds_3, ds_4, ds_TargetY);
%%
lgraph = layerGraph();
for i = 1 : NumInputs
    if i == NumInputs
        UsedVars = NumVars - 1;
    else
        UsedVars = NumVars;
    end
    NameStrIn = ['In_' num2str(i)];
    inLayer = featureInputLayer(UsedVars, 'Name', NameStrIn);
    lgraph = addLayers(lgraph, inLayer);
    NameStrFC = ['FC_' num2str(i)];
    fcLayer = fullyConnectedLayer(UsedVars, 'Name', NameStrFC);
    lgraph = addLayers(lgraph, fcLayer);
    lgraph = connectLayers(lgraph, NameStrIn, NameStrFC);
end
concatLayer = concatenationLayer(1, NumInputs, 'Name', 'Concat');
lgraph = addLayers(lgraph, concatLayer);
for CurIdx = 1 : NumInputs
    lgraph = connectLayers(lgraph, ['FC_' num2str(CurIdx)], ['Concat/in' num2str(CurIdx)]);
end
Final_Layers = [
    fullyConnectedLayer(2, 'Name', 'Final_Layers')
    fullyConnectedLayer(1)
    regressionLayer];
lgraph = addLayers(lgraph, Final_Layers);
lgraph = connectLayers(lgraph, 'Concat', 'Final_Layers');
analyzeNetwork(lgraph)
%%
options = trainingOptions("sgdm", ...
    MaxEpochs=15, ...
    InitialLearnRate=0.01, ...
    Plots="training-progress", ...
    Verbose=0);
net = trainNetwork(dsTrain, lgraph, options);


Answers (1)

Vinayak Choyyan
Vinayak Choyyan on 15 Feb 2023
Hello Chih,
As per my understanding, you have designed a neural network with 4 ‘featureInputLayer’ inputs. As dummy data, you provide the model with 3 ‘arrayDatastore’ objects of size 2x3 and a 4th ‘arrayDatastore’ of size 2x2. TargetY is the output, of size 2x1. I assume these 2 rows are separate observations, making the input to the model of size 11 and the output of size 1. To feed the training data to the model, you used ‘combine’ to create a ‘CombinedDatastore’.
When you train a network using ‘trainNetwork’, it reads data from the ‘CombinedDatastore’. If we check what the ‘CombinedDatastore’ returns, we can see the following.
NumVars = 3;
NumInputs = 4;
TotRows = 2;
ds_1 = arrayDatastore(zeros(TotRows, NumVars));
ds_2 = arrayDatastore(zeros(TotRows, NumVars));
ds_3 = arrayDatastore(zeros(TotRows, NumVars));
ds_4 = arrayDatastore(zeros(TotRows, NumVars - 1));
TargetY = zeros(TotRows, 1);
ds_TargetY = arrayDatastore(TargetY);
dsTrain = combine(ds_1, ds_2, ds_3, ds_4, ds_TargetY);
read(dsTrain)
ans = 1×5 cell array
{[0 0 0]} {[0 0 0]} {[0 0 0]} {[0 0]} {[0]}
There are 4 inputs and 1 output to the model, so ‘trainNetwork’ expects 4 + 1 = 5 data items per read. You can see that the first 3 elements are cells of size 1x3, but the model expects each observation as a 3x1 column vector.
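To see the shape issue in isolation: transposing the data and setting ‘IterationDimension’ to 2 makes ‘arrayDatastore’ return one column per read, which matches the observation shape ‘featureInputLayer’ expects. A minimal sketch with the same dummy data:

```matlab
% One 2x3 dummy matrix: 2 observations, 3 features
tmp = zeros(2, 3);

% Default iteration reads along dimension 1, returning 1x3 row vectors
dsRow = arrayDatastore(tmp);
read(dsRow)    % {[0 0 0]}  -- a 1x3 row, which trainNetwork rejects here

% Transpose and iterate along dimension 2: each read is a 3x1 column
dsCol = arrayDatastore(tmp', "IterationDimension", 2);
read(dsCol)    % {[0; 0; 0]} -- a 3x1 column, one observation per read
```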
As a workaround, please try the code below. I have only modified the 5 ‘arrayDatastore’ objects you created so that they return data of the expected size; the rest is the same.
clc;clear;
NumVars = 3;
NumInputs = 4;
TotRows = 2;
%% while hasdata(ds_1), read(ds_1), end
tmp = zeros(TotRows, NumVars);
tmp1 = zeros(TotRows, NumVars - 1);
ds_1 = arrayDatastore(tmp', "IterationDimension", 2);
ds_2 = arrayDatastore(tmp', "IterationDimension", 2);
ds_3 = arrayDatastore(tmp', "IterationDimension", 2);
ds_4 = arrayDatastore(tmp1', "IterationDimension", 2);
TargetY = zeros(TotRows, 1);
ds_TargetY = arrayDatastore(TargetY', "IterationDimension", 2);
dsTrain = combine(ds_1, ds_2, ds_3, ds_4, ds_TargetY);
%%
lgraph = layerGraph();
for i = 1 : NumInputs
    if i == NumInputs
        UsedVars = NumVars - 1;
    else
        UsedVars = NumVars;
    end
    NameStrIn = ['In_' num2str(i)];
    inLayer = featureInputLayer(UsedVars, 'Name', NameStrIn);
    lgraph = addLayers(lgraph, inLayer);
    NameStrFC = ['FC_' num2str(i)];
    fcLayer = fullyConnectedLayer(UsedVars, 'Name', NameStrFC);
    lgraph = addLayers(lgraph, fcLayer);
    lgraph = connectLayers(lgraph, NameStrIn, NameStrFC);
end
concatLayer = concatenationLayer(1, NumInputs, 'Name', 'Concat');
lgraph = addLayers(lgraph, concatLayer);
for CurIdx = 1 : NumInputs
    lgraph = connectLayers(lgraph, ['FC_' num2str(CurIdx)], ['Concat/in' num2str(CurIdx)]);
end
Final_Layers = [
    fullyConnectedLayer(2, 'Name', 'Final_Layers')
    fullyConnectedLayer(1)
    regressionLayer];
lgraph = addLayers(lgraph, Final_Layers);
lgraph = connectLayers(lgraph, 'Concat', 'Final_Layers');
analyzeNetwork(lgraph)
%%
options = trainingOptions("sgdm", ...
    MaxEpochs=15, ...
    InitialLearnRate=0.01, ...
    Plots="training-progress", ...
    Verbose=0);
net = trainNetwork(dsTrain, lgraph, options);
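After this change, a quick sanity check (assuming the modified datastores above have been created) shows that each read now yields one observation per input, with features along the first dimension:

```matlab
% Expect a 1x5 cell: three 3x1 columns, one 2x1 column, and a 1x1 target
reset(dsTrain);
read(dsTrain)
```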
If you would like to read more, please check out the documentation pages for ‘arrayDatastore’, ‘combine’, and ‘trainNetwork’.
I hope this resolves the issue you are facing.

Version

R2022b
