How many LSTM blocks are there in bidirectional LSTM layers?
Hi,
How can I relate hidden layers to the number of LSTM blocks?
inputSize = 17;         % number of features per time step
numHiddenUnits = 50;    % hidden units per direction in the BiLSTM layer
numClasses = 2;
maxEpochs = 15;
miniBatchSize = 1;
layers = [ ...
sequenceInputLayer(inputSize)
bilstmLayer(numHiddenUnits,'OutputMode','last')
fullyConnectedLayer(numClasses)
softmaxLayer
classificationLayer]
options = trainingOptions('adam', ...
'ExecutionEnvironment','auto', ...
'GradientThreshold',1, ...
'MaxEpochs',maxEpochs, ...
'MiniBatchSize',miniBatchSize, ...
'SequenceLength','longest', ...
'Shuffle','never', ...
'Verbose',0, ...
'Plots','training-progress');
Answers (1)
Shantanu Dixit
on 20 Jun 2023
Hi Shweta,
Assuming that by hidden layers you mean numHiddenUnits: numHiddenUnits refers to the number of LSTM blocks per direction, and this is the same for both LSTM and BiLSTM layers. So here the number of LSTM blocks per direction is 50 (numHiddenUnits = 50).
Refer to the documentation for lstmLayer and bilstmLayer.
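A minimal sketch (reusing the layer sizes from your post) of how you could confirm this yourself: the layer's NumHiddenUnits property gives the blocks per direction, and analyzeNetwork shows that the BiLSTM activations have 2*numHiddenUnits features, because the forward and backward outputs are concatenated.
numHiddenUnits = 50;
bilayer = bilstmLayer(numHiddenUnits,'OutputMode','last');
disp(bilayer.NumHiddenUnits)   % 50 LSTM blocks (hidden units) per direction
layers = [ ...
    sequenceInputLayer(17)
    bilstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];
analyzeNetwork(layers)         % BiLSTM activations: 2*50 = 100 (forward + backward)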