Error using trainnet (line 46)
Bahadir on 15 Oct 2025
Commented: Bahadir on 17 Oct 2025
Dear sir,
My XTrain is a 48941x1 cell array and TTrain is a 48941x1 categorical array, as shown below.

Why do I get this error?
Error using trainnet (line 46)
Number of observations in predictors (48941) and targets (1) must match. Check that
the data and network are consistent.
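A quick sanity check on the workspace variables (a sketch; it assumes each cell of XTrain holds one 30x30 grayscale image, as the input layer below suggests):
numel(XTrain)   % 48941 observations in the predictors
numel(TTrain)   % 48941 labels in the targets
size(XTrain{1}) % expected [30 30], one image per cell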
layers = [
    sequenceInputLayer([30 30 1],'Name','input') % 2-D image sequence input: [h w c] = image height, width, channels
    convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv1')
    batchNormalizationLayer('Name','bn1') % normalizes each mini-batch per channel; speeds up training and reduces sensitivity to initialization
    reluLayer('Name','relu1') % ReLU: identity on positive inputs, zero on negative inputs
    convolution2dLayer(5,8,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv2')
    batchNormalizationLayer('Name','bn2')
    reluLayer('Name','relu2')
    convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv3')
    batchNormalizationLayer('Name','bn3')
    reluLayer('Name','relu3')
    convolution2dLayer(5,16,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv4')
    batchNormalizationLayer('Name','bn4')
    reluLayer('Name','relu4')
    convolution2dLayer(5,16,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv5')
    batchNormalizationLayer('Name','bn5')
    reluLayer('Name','relu5')
    convolution2dLayer(5,32,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv6')
    batchNormalizationLayer('Name','bn6')
    reluLayer('Name','relu6')
    globalAveragePooling2dLayer('Name','gap1')
    fullyConnectedLayer(7) % one output per class
    softmaxLayer];
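Before training, the layer array can be checked on its own (a minimal sketch; analyzeNetwork ships with Deep Learning Toolbox and flags sizing or naming problems):
analyzeNetwork(layers) % reports activation sizes and learnables per layer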
options = trainingOptions("adam", ...
    MaxEpochs=4, ...
    InitialLearnRate=0.002, ...
    MiniBatchSize=128, ...
    GradientThreshold=1, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=20, ...
    LearnRateDropFactor=0.8, ...
    L2Regularization=1e-3, ...
    Shuffle="every-epoch", ...
    Plots="training-progress", ...
    ObjectiveMetricName="loss", ...
    OutputNetwork="best-validation", ...
    ValidationPatience=5, ... % stop training if the validation loss has not improved in five consecutive validations
    ValidationFrequency=50, ...
    Verbose=false, ...
    Metrics="accuracy", ...
    ValidationData={XValidation,TValidation});
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
3 comments
Walter Roberson on 15 Oct 2025
In order to test we would need corresponding XValidation and TValidation
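For example, a holdout split carved from the attached training data would do (a sketch; the 10% fraction is an assumption):
N = numel(XTrain);
idx = randperm(N);                  % random shuffle of observation indices
nVal = round(0.1*N);                % hold out ~10% for validation
XValidation = XTrain(idx(1:nVal));
TValidation = TTrain(idx(1:nVal));
XTrain = XTrain(idx(nVal+1:end));
TTrain = TTrain(idx(nVal+1:end));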
Accepted Answer
Matt J on 16 Oct 2025
Edited: Matt J on 16 Oct 2025
It appears that if your XTrain is in cell array form, you need to put your TTrain data in cell form as well:
load('attachedData.mat'); clear ans; whos %Inventory
TTrain=num2cell(TTrain); % convert the categorical vector to a 48941x1 cell of scalar labels
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
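With both inputs in cell form, each cell of XTrain pairs with one cell of TTrain, so trainnet sees 48941 observations on both sides rather than treating the whole categorical vector as a single target.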
2 comments
Matt J on 16 Oct 2025
Edited: Matt J on 16 Oct 2025
You are using a sequenceInputLayer, but your training inputs appear to just be 30x30 images. An imageInputLayer might be more appropriate...
load('attachedData.mat');
XTrain=cat(4,XTrain{:}); % stack the cells into a 30x30x1x48941 numeric array (h-by-w-by-c-by-N)
layers(1)=imageInputLayer([30,30,1],Name="input");
options.Plots='none'; %Online environment doesn't support plots
options.Verbose=true;
options.ValidationData={XTrain,TTrain}; %Fake validation data
testPrediction=minibatchpredict(dlnetwork(layers), XTrain(:,:,:,1:3)) %test
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
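With the predictors in numeric-array form, the original categorical TTrain works as-is: trainnet counts observations along the fourth dimension of the image array and along the rows of the categorical targets, so no cell conversion is needed.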
More Answers (0)