Problem with ResNet-18 on multi-spectral image segmentation

Hi, I was trying to set up multi-spectral image segmentation with ResNet-18, referring to this answer: resnet50-on-multi-spectral-image-segmentation. When it trains on RGB images, the performance is not bad. But when I change the images to multi-spectral images (array size 1024*1024*151), I get the following error: "Invalid training data. The output size (3) of the last layer does not match the number of classes of the responses (1)."
Can anyone help me with that?
imageDir = 'C:\dataset\train_images';
imds = MySequenceDatastore(imageDir);
classNames = ["norm","low","high"];
labelDir = 'C:\dataset\PixelLabelData';
labelIDs = [0,1,2];
pxds = pixelLabelDatastore(labelDir,classNames,labelIDs);
imageSize = [1024 1024 3];
N = 151;
numClasses = numel(classNames);
lgraph = deeplabv3plusLayers(imageSize, numClasses, "resnet18");
analyzeNetwork(lgraph)
layers = lgraph.Layers;
newlgraph = replaceLayer(lgraph,'data',imageInputLayer([1024 1024 N],'Name','input'));
newlgraph = replaceLayer(newlgraph,'conv1',convolution2dLayer(7,64,'stride',[2 2],'padding',[3 3 3 3],'Name','conv1'));
analyzeNetwork(newlgraph)
trainData = combine(imds,pxds);
opts = trainingOptions('sgdm',...
"ExecutionEnvironment","gpu",...
"InitialLearnRate",0.001,...
'MiniBatchSize',16,...
"Plots","training-progress",...
'MaxEpochs',30);
[net,info] = trainNetwork(trainData,newlgraph,opts);
%% function MySequenceDatastore
classdef MySequenceDatastore < matlab.io.Datastore & ...
matlab.io.datastore.MiniBatchable
properties
Datastore
Labels
NumClasses
SequenceDimension
MiniBatchSize
end
properties(SetAccess = protected)
NumObservations
end
properties(Access = private)
% This property is inherited from Datastore
CurrentFileIndex
end
methods
function ds = MySequenceDatastore(folder)
% Construct a MySequenceDatastore object
% Create a file datastore. The readSequence function is
% defined following the class definition.
fds = fileDatastore(folder, ...
'ReadFcn',@readSequence, ...
'IncludeSubfolders',true);
ds.Datastore = fds;
% Read labels from folder names
numObservations = numel(fds.Files);
for i = 1:numObservations
file = fds.Files{i};
filepath = fileparts(file);
[~,label] = fileparts(filepath);
labels{i,1} = label;
end
ds.Labels = categorical(labels);
ds.NumClasses = numel(unique(labels));
% Determine sequence dimension. When you define the LSTM
% network architecture, you can use this property to
% specify the input size of the sequenceInputLayer.
X = preview(fds);
ds.SequenceDimension = size(X,1);
% Initialize datastore properties.
ds.MiniBatchSize = 1; %128
ds.NumObservations = numObservations;
ds.CurrentFileIndex = 1;
end
function dsNew = shuffle(ds)
% dsNew = shuffle(ds) shuffles the files and the
% corresponding labels in the datastore.
% Create a copy of datastore
dsNew = copy(ds);
dsNew.Datastore = copy(ds.Datastore);
fds = dsNew.Datastore;
% Shuffle files and corresponding labels
numObservations = dsNew.NumObservations;
idx = randperm(numObservations);
fds.Files = fds.Files(idx);
dsNew.Labels = dsNew.Labels(idx);
end
function tf = hasdata(ds)
% Return true if more data is available
tf = ds.CurrentFileIndex + ds.MiniBatchSize - 1 ...
<= ds.NumObservations;
end
function [data,info] = read(ds)
% Read one mini-batch batch of data
miniBatchSize = ds.MiniBatchSize;
info = struct;
for i = 1:miniBatchSize
predictors{i,1} = read(ds.Datastore);
responses(i,1) = ds.Labels(ds.CurrentFileIndex);
ds.CurrentFileIndex = ds.CurrentFileIndex + 1;
end
data = preprocessData(ds,predictors,responses);
end
function data = preprocessData(ds,predictors,responses)
% data = preprocessData(ds,predictors,responses) preprocesses
% the data in predictors and responses and returns the table
% data
miniBatchSize = ds.MiniBatchSize;
% Pad data to length of longest sequence.
sequenceLengths = cellfun(@(X) size(X,2),predictors);
maxSequenceLength = max(sequenceLengths);
for i = 1:miniBatchSize
X = predictors{i};
% Pad sequence with zeros.
if size(X,2) < maxSequenceLength
X(:,maxSequenceLength) = 0;
end
predictors{i} = X;
end
% Return data as a table.
data = table(predictors,responses);
end
function reset(ds)
% Reset to the start of the data
reset(ds.Datastore);
ds.CurrentFileIndex = 1;
end
end
methods (Hidden = true)
function frac = progress(ds)
% Determine percentage of data read from datastore
frac = (ds.CurrentFileIndex - 1) / ds.NumObservations;
end
end
end % classdef

2 comments

It would be easiest if you just attached newlgraph and trainData in a .mat file
Also, attach an instance of training and response data by using trainData.read().


Answers (1)

Vinayak Choyyan on 14 Dec 2023
Hi,
From a quick look at the code, it appears your custom datastore already returns a table of predictors and responses. You then combine imds, the custom datastore, with pxds, a pixelLabelDatastore, which is also a response datastore. That is probably unintended and the likely cause of the error.
The error message you are getting appears when the network you are training expects a certain number of classes (3 in your case) but the data passed to it for training contains only one class. I suggest calling read(trainData) first to check whether your training datastore is actually reading what you expect.
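For example, a quick inspection along those lines (a sketch, assuming trainData is the CombinedDatastore built in the script above):

```matlab
% Peek at one training sample from the combined datastore.
reset(trainData)
sample = read(trainData);     % a CombinedDatastore returns one cell per underlying datastore
disp(size(sample{1}))         % predictor: expected 1024x1024x151
disp(categories(sample{end})) % response: expected 3 classes (norm, low, high)
reset(trainData)
```

If the predictor cell already contains a table, or the response has fewer classes than expected, that points to the datastore rather than the network.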

9 comments

Hi there! Thanks for your help!
After using read(trainData), I found that my datastore has some problems. There is an extra value, "responses", which is not supposed to be there.
I also checked my previous datastore (RGB images). The data form is quite different.
Is there a way to make my multi-spectral datastore produce the same data form as the RGB one? I appreciate you answering my question in your busy time!
As you have not shared any data or a data description, I am going to assume it for the example below.
Let's say your data consists of '.tif' files in the folder '/data/predictors', and the pixel labels are also '.tif' files, in the folder '/data/labels'.
For reading the hyperspectral data, which would be the predictors:
root = '/data/predictors';
imds = imageDatastore(root,"FileExtensions",".tif",ReadFcn=@customReader); % customReader will most likely be the readSequence() mentioned in your code
function out = customReader(filename)
% Option 1: replace this with custom code to read a single
% hyperspectral cube, depending on how your data is saved.
out = imread(filename);
% Option 2: use the hypercube function in MATLAB to read
% hyperspectral data instead:
% hcube = hypercube(filename);
% out = hcube.DataCube;
% NOTE: these two options are either/or. Keep only one of them.
end
% Place the function block at the end of your script.
For reading the label data, which would be the responses:
labelroot='/data/labels';
classNames=["norm","low","high"];
classLabels=0:2;
pxds = pixelLabelDatastore(labelroot,classNames,classLabels,FileExtensions='.tif');
This way you don't need to write a custom datastore to handle batch size and the rest.
trainData = combine(imds,pxds);
You can also easily wrap imds with the transform function to apply any transformation/augmentation to the data as it is read.
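For instance, a minimal sketch of such a wrapper (the per-band normalization here is purely illustrative, not something your pipeline requires):

```matlab
% Hypothetical example: normalize each spectral band on the fly.
% The anonymous function receives one hyperspectral cube per read.
imdsNorm = transform(imds, ...
    @(x) (double(x) - mean(double(x),[1 2])) ./ (std(double(x),0,[1 2]) + eps));
trainDataNorm = combine(imdsNorm, pxds);
```

The transformed datastore can be combined with pxds exactly like the untransformed one.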
Hope this helps.
Excuse me for my carelessness, I forgot to provide my data description.
I have 60 sets of hyperspectral images; each set has 151 '.tif' files representing 151 spectral bands, and the size of each image is 1024*1024. I have already stacked these '.tif' files into 60 '.mat' files (array size 1024*1024*151) as my training data.
So what should I do for data reading?
In this case, since you have converted the data to 60 .mat files, you can change the imageDatastore input parameters like this:
imds = imageDatastore(root,"FileExtensions",".mat",ReadFcn=@customReader);
function out = customReader(filename)
data = load(filename);
out = data.data; % assumes the variable saved in the .mat file is named 'data'
end
As per my understanding, there are 60 folders and each folder has 151 images in '.tif' format. Stacking these 151 images together makes one hyperspectral image. For your future reference, I will also include another way of doing the data reading; with this method you do not need to go through the effort of converting the data to .mat files:
imds=imageDatastore(root,"IncludeSubfolders",true,"FileExtensions",".tif",ReadSize=151);
hyperImds=transform(imds,@stackImagesCustom)
function out=stackImagesCustom(x)
out=cat(3,x{:});
end
Then create the pixelLabelDatastore as before and combine it with the hyperImds above.
labelroot='/data/labels';
classNames=["norm","low","high"];
classLabels=0:2;
pxds = pixelLabelDatastore(labelroot,classNames,classLabels,FileExtensions='.tif');
trainData = combine(hyperImds,pxds);
You can read more on imageDatastore here: Datastore for image data - MATLAB - MathWorks India, and on the transform function here: Transform datastore - MATLAB transform - MathWorks India.
Thank you very much for your help!!!
But I still have a problem. The datastore now shows what I expected, but I still get an error:
'Error using trainNetwork
Arguments dimensions are not consistent.
Error using cat
CAT arguments dimensions are not consistent.'
I am very confused.
This error is coming from the line out=cat(3,x{:}); inside our transform function. It looks like one or more of your 151*60 images are not the same size. Could you confirm that?
The code below might help you check:
imds=imageDatastore(root,"IncludeSubfolders",true,"FileExtensions",".tif","ReadFcn",@sizeCheckerCustom);
readall(imds);
function out = sizeCheckerCustom(filename)
out = imread(filename);
if ~isequal(size(out),[1024 1024]) % isequal avoids errors when the number of dimensions differs
disp(filename)
out = zeros(1024,1024);
end
end
For faster replies from a dedicated team, consider creating a new technical service request at Contact Us - MATLAB & Simulink (mathworks.com) .
It turns out that I had mistakenly placed some images in the dataset. Following your suggestion, I have fixed the dataset.
However, it still reports an error:
"Error using trainNetwork
Arrays have incompatible sizes for this operation."
Why would that happen? Could you explain? Thanks a lot.
This seems like a data issue: there is a mismatch in array sizes. The basic problem is that when trainNetwork reads all the data, some hyperspectral cubes come back with different sizes. You will have to check this yourself, as I do not have the data.
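A minimal sketch of such a check, assuming the 60 cubes are stored as .mat files each containing a variable named data (as in the customReader above):

```matlab
% Sketch: list every cube whose size is not the expected 1024x1024x151.
cubeFiles = dir(fullfile(root,'*.mat'));
for k = 1:numel(cubeFiles)
    s = load(fullfile(cubeFiles(k).folder, cubeFiles(k).name));
    if ~isequal(size(s.data), [1024 1024 151]) % 'data' is the assumed variable name
        fprintf('Bad cube size %s: %s\n', cubeFiles(k).name, mat2str(size(s.data)));
    end
end
```

The same loop, pointed at labelroot with imread instead of load, would catch any label mask that does not match its cube.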
I totally understand what you mean. It would be even better if you could provide some ideas for checking the data.
Thank you for your help these days.


Asked: 13 Dec 2023
Commented: 15 Dec 2023
