Invalid training data. Predictors and responses must have the same number of observations. In trainNetwork function, multiple featureInputLayer, neural network for regression
5 views (last 30 days)
Farshina Nazrul Shimim
on 29 Aug 2023
Commented: Yash
on 30 Aug 2023
Hi Everyone,
I am trying to build a multiple-input feedforward neural network for regression. I got the idea of what the input and output structure should be from here. For the network architecture, I followed this documentation.
Here's my network architecture:
Here's my relevant code with some arbitrary parameters:
clear all
close all
clc
%% Load data
rng("default")
nobs = 24;
numFeatures =[42,6];
numResponses = 1;
XTrain = {rand(numFeatures(1),nobs,"single");rand(numFeatures(2),nobs,"single")};
YTrain = {rand(numResponses,nobs,"single")};
numHiddenUnits = [10,20;15,12;10,5]; % row 1 = NN1 (named S); row 2 = NN2 (named ET); row 3 = NNcombined (no specific name);
dropput_prob = 0.3;
maxEpochs = 10;
initial_LR = 0.001;
%% NN architecture
lgraph = layerGraph;
lgraph = addLayers(lgraph,featureInputLayer(numFeatures(1,1),"Name","layers_S_in"));
lgraph = addLayers(lgraph,featureInputLayer(numFeatures(1,2),"Name","layers_ET_in"));
l_prev_S = "layers_S_in";
l_prev_ET = "layers_ET_in";
for i = 1:length(numHiddenUnits(1,:))
% row 1 = NN1; row 2 = NN2; row 3 = NNcombined;
if numHiddenUnits(1,i)> 0
layername_S = "layers_S_hidden" + string(i);
lgraph = addLayers(lgraph,fullyConnectedLayer(numHiddenUnits(1,i),"Name",layername_S));
lgraph = connectLayers(lgraph,l_prev_S,layername_S);
l_prev_S = layername_S;
% add relulayer
layername_S = "reluLayer_S" + string(i);
lgraph = addLayers(lgraph,reluLayer("Name",layername_S));
lgraph = connectLayers(lgraph,l_prev_S,layername_S);
l_prev_S = layername_S;
end
if numHiddenUnits(2,i)> 0
layername_ET = "layers_ET_hidden" + string(i);
lgraph = addLayers(lgraph,fullyConnectedLayer(numHiddenUnits(2,i),"Name",layername_ET));
lgraph = connectLayers(lgraph,l_prev_ET,layername_ET);
l_prev_ET = layername_ET;
% add relulayer
layername_ET = "reluLayer_ET" + string(i);
lgraph = addLayers(lgraph,reluLayer("Name",layername_ET));
lgraph = connectLayers(lgraph,l_prev_ET,layername_ET);
l_prev_ET = layername_ET;
end
% append dropout if not the last layer
if i < length(numHiddenUnits(1,:))
% add dropoutlayer
layername_S = "dropoutLayer_S" + string(i);
lgraph = addLayers(lgraph,dropoutLayer(dropput_prob,"Name",layername_S));
lgraph = connectLayers(lgraph,l_prev_S,layername_S);
l_prev_S = layername_S;
layername_ET = "dropoutLayer_ET" + string(i);
lgraph = addLayers(lgraph,dropoutLayer(dropput_prob,"Name",layername_ET));
lgraph = connectLayers(lgraph,l_prev_ET,layername_ET);
l_prev_ET = layername_ET;
end
end
concat = concatenationLayer(1,2,'Name','concat');
lgraph = addLayers(lgraph, concat);
lgraph = connectLayers(lgraph, l_prev_S, 'concat/in1');
lgraph = connectLayers(lgraph, l_prev_ET, 'concat/in2');
l_prev = concat.Name;
for i = 1:length(numHiddenUnits(3,:))
if numHiddenUnits(3,i)> 0
layername = "layers_hidden" + string(i);
lgraph = addLayers(lgraph,fullyConnectedLayer(numHiddenUnits(3,i),"Name",layername));
lgraph = connectLayers(lgraph,l_prev,layername);
l_prev = layername;
% add relulayer
layername = "reluLayer" + string(i);
lgraph = addLayers(lgraph,reluLayer("Name",layername));
lgraph = connectLayers(lgraph,l_prev,layername);
l_prev = layername;
end
% append dropout if not the last layer
if i < length(numHiddenUnits(3,:))
% add dropoutlayer
layername = "dropoutLayer" + string(i);
lgraph = addLayers(lgraph,dropoutLayer(dropput_prob,"Name",layername));
lgraph = connectLayers(lgraph,l_prev,layername);
l_prev = layername;
end
end
layername = "outputLayer";
lgraph = addLayers(lgraph,fullyConnectedLayer(numResponses,"Name",layername));
lgraph = connectLayers(lgraph,l_prev,layername);
l_prev = layername;
layername = "regressionLayer";
lgraph = addLayers(lgraph,regressionLayer("Name",layername));
lgraph = connectLayers(lgraph,l_prev,layername);
% visualization of the network
figure
plot(lgraph)
title("FNN architecture")
%% Train
options = trainingOptions('adam', ...
'MaxEpochs',maxEpochs, ...
'InitialLearnRate',initial_LR, ...
'Shuffle','every-epoch', ...
'Plots','training-progress',...
'Verbose',1);
net = trainNetwork(XTrain,YTrain,lgraph,options);
And while running the code in MATLAB R2023a, I am getting the following error:
"Invalid training data. Predictors and responses must have the same number of observations."
Can I get some help with figuring out why this error is occurring and what I can do to solve it? Please let me know if any further information is needed. Also, is there any other relevant documentation for training and testing this type of network?
Thanks so much for your time and help!
0 comments
Accepted Answer
Yash
on 29 Aug 2023
The error you're encountering, "Invalid training data. Predictors and responses must have the same number of observations," is occurring because the XTrain and YTrain data you're providing to trainNetwork are not compatible in terms of their dimensions.
In your code, you defined XTrain and YTrain as cell arrays, which contain two sets of inputs and one set of outputs. This means you're trying to train a neural network with two separate input branches. However, when using trainNetwork for regression, you typically provide a single input and a single output, not cell arrays.
To fix this issue and train your network, you should concatenate your input data along the appropriate dimension and reshape your output data if necessary. Here's how you can modify your code:
% Combine the two input sets into one
XTrainCombined = cat(1, XTrain{1}, XTrain{2});
% Extract the single set of responses from the cell array
YTrainCombined = YTrain{1};
% Train the network with the combined data
net = trainNetwork(XTrainCombined, YTrainCombined, lgraph, options);
This code combines your two sets of input data into a single matrix that can be used for training. Make sure that XTrainCombined and YTrainCombined have the same number of observations, which should be nobs in your case.
Once you make this change, the training should proceed without the error.
Regarding documentation, MATLAB's official documentation is an excellent resource for learning about neural networks and their training in MATLAB. You can refer to the following documentation pages:
- Deep Learning with MATLAB - the starting point for deep learning in MATLAB.
- Train Deep Learning Networks in MATLAB - detailed information on training deep learning networks, including the available options and techniques.
- Deep Learning Toolbox Documentation - the comprehensive documentation for MATLAB's Deep Learning Toolbox, covering network design, training, and deployment.
You can explore these resources to gain a deeper understanding of how to work with neural networks in MATLAB.
I hope this helps.
2 comments
Yash
on 30 Aug 2023
I believe the documentation proposed in your comment is one of many examples you could refer to. However, to train a multiple-input network, you need to use a single datastore that combines both sets of input data. You can achieve this with custom data handling using the arrayDatastore function in MATLAB.
You can refer to the following documentation page: arrayDatastore
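To make the arrayDatastore suggestion concrete, here is a minimal sketch (not from the original thread) of how the question's two predictor arrays and the single response array could be wrapped into one combined datastore for trainNetwork. The transposes assume the arrays are stored features-by-observations, as in the question's code; the column order passed to combine must match the order of the network's input layers (lgraph.InputNames), with the responses last.

```matlab
% arrayDatastore iterates over the first dimension by default, so
% transpose each array to observations-by-features first.
dsX_S  = arrayDatastore(XTrain{1}');   % 24-by-42 predictors for "layers_S_in"
dsX_ET = arrayDatastore(XTrain{2}');   % 24-by-6 predictors for "layers_ET_in"
dsY    = arrayDatastore(YTrain{1}');   % 24-by-1 responses

% Combine into a single datastore: one column per network input,
% in the order of lgraph.InputNames, with the responses last.
dsTrain = combine(dsX_S, dsX_ET, dsY);

% trainNetwork accepts a combined datastore directly.
net = trainNetwork(dsTrain, lgraph, options);
```

This keeps the two input branches separate, so each featureInputLayer receives data of the size it expects, instead of concatenating the predictors into a single matrix.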
Más respuestas (0)
See Also
Categories
More about Custom Training Loops in Help Center and File Exchange.