How to create LSTM network of multiple dimension

11 views (last 30 days)
Vinay Kulkarni
Vinay Kulkarni on 10 Mar 2023
Commented: Vinay Kulkarni on 15 Mar 2023
Hi,
My input to the neural network has dimensions 653956x32x16. Of this, 32x16 is the input size, which I used to define the layers as follows:
layers = [
    sequenceInputLayer(input_size_network)
    flattenLayer
    lstmLayer(32,'Name','LSTM1')
    layerNormalizationLayer('Name','LSTM1_layernorm')
    lstmLayer(16,'Name','LSTM2')
    layerNormalizationLayer('Name','LSTM2_layernorm')
    lstmLayer(8,'Name','LSTM3')
    layerNormalizationLayer('Name','LSTM3_layernorm')
    flattenLayer('Name','Flatten_layer')
    fullyConnectedLayer(48,'Name','FC1')
    fullyConnectedLayer(16,'Name','FC2')
    fullyConnectedLayer(3,'Name','FC3')
    softmaxLayer
    classificationLayer
    ];
options = trainingOptions("adam", ...
    "MaxEpochs",50, ...
    "SequencePaddingDirection","left", ...
    "InitialLearnRate",0.001, ...
    "Shuffle","every-epoch", ...
    "Plots","training-progress", ...
    "Verbose",0);
cv = cvpartition(size(X,1),'HoldOut',0.2);
idx = cv.test;
XTrain = X(~idx,:,:);
YTrain =Y(~idx,:);
XTest = X(idx,:,:);
YTest =Y(idx,:);
net = trainNetwork(XTrain,YTrain,layers,options);
But when I pass XTrain, which has the same dimensions (653956x32x16), to train the network, I get the following error:
Error using trainNetwork (line 184)
The training sequences are of feature dimension 653956 32 but the input layer expects sequences of feature
dimension 32 16.
Can you please help me with how to pass the input? I want the LSTM to treat 16 as the feature dimension and 32 as the time sequence.
Note: X (and XTrain) is a plain numeric array, not a cell array.

Answers (1)

Ben
Ben on 13 Mar 2023
It appears your data is in (Batch) x (Sequence) x (Features) format. trainNetwork expects sequence data as a (Batch) x 1 cell array in which each entry is a (Features) x (Sequence) array. For example, here is how to transform an X of the size you have into the required format:
batchSize = 653956;
sequenceLength = 32;
numFeatures = 16;
X = randn(batchSize,sequenceLength,numFeatures);
% trainNetwork needs each sequence to be in (Features) x (Sequence) shape, so permute these dimensions to the front.
X = permute(X,[3,2,1]);
% trainNetwork needs sequence data to be represented as a cell-array
X = num2cell(X,[1,2]);
X = X(:);
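As a quick sanity check (a sketch using the sizes above), the result should be a (Batch) x 1 cell array whose entries are (Features) x (Sequence) matrices:

```matlab
% After the permute and num2cell steps, X should be a 653956x1 cell
% array whose entries are 16x32 (Features x Sequence) matrices.
disp(size(X))     % 653956 1
disp(size(X{1}))  % 16 32
```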
As a side note, since you are using sequences of vectors as input, the flattenLayer instances are unnecessary.
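Following that note, the layer array could be simplified to something like the sketch below (it assumes your task is sequence-to-label classification, i.e. one label per sequence; under that assumption the final LSTM should use "OutputMode","last"):

```matlab
numFeatures = 16;  % feature dimension of each time step
layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(32,'Name','LSTM1')
    layerNormalizationLayer('Name','LSTM1_layernorm')
    lstmLayer(16,'Name','LSTM2')
    layerNormalizationLayer('Name','LSTM2_layernorm')
    % 'OutputMode','last' emits only the final hidden state, giving one
    % vector per sequence for the classifier below (this assumes a
    % sequence-to-label task rather than sequence-to-sequence).
    lstmLayer(8,'Name','LSTM3','OutputMode','last')
    layerNormalizationLayer('Name','LSTM3_layernorm')
    fullyConnectedLayer(48,'Name','FC1')
    fullyConnectedLayer(16,'Name','FC2')
    fullyConnectedLayer(3,'Name','FC3')
    softmaxLayer
    classificationLayer
    ];
```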
  3 comments
Ben
Ben on 14 Mar 2023
> the assumption of time sequence is automatically taken care
Yes, you do not need to specify the sequence length in sequenceInputLayer; in fact, your data can contain variable sequence lengths and sequenceInputLayer handles this.
>LSTM cell's hidden units need to pass return sequences
This is the default for us: lstmLayer(32,"Name","LSTM1") returns the sequence of hidden states; it is equivalent to setting "OutputMode" to its default value, "sequence".
I notice your Keras LSTM also has return_state=True. We support this as well via the "HasStateOutputs" name-value pair, so you might want something like:
layer = lstmLayer(32,"Name","LSTM1","HasStateOutputs",true);
This will output the states from the 2nd output of the layer.
Vinay Kulkarni
Vinay Kulkarni on 15 Mar 2023
Thanks Ben.
But I am wondering why I am getting an error for
lstmLayer(32,"Name","LSTM1")


Version

R2021b
