Basic question regarding LSTM neural network

3 views (last 30 days)
Pai-Feng Teng on 10 Jun 2021
Answered: Shubham on 21 Feb 2024
I am currently trying to modify code from others to simulate a rainfall-runoff model with LSTM neural networks.
I have reviewed your page many times regarding the definition of each step, to make sure my process is accurate.
However, there are parameters I can never find when the computer engineers ask me to explain my code.
1) How do I set up the look-back time for the LSTM? More specifically, how do I know the duration of the cell memory of the LSTM I set up? I looked through the entire page and can't find any information about it.
2) Is it possible to set up an Evolutionary Attention-based LSTM (EA-LSTM) with MATLAB?

Answers (1)

Shubham on 21 Feb 2024
Hi Pai,
The concept of "backtracking time" or the duration of cell memory in an LSTM isn't explicitly set as a parameter but is determined by the architecture of the LSTM and the sequence length of the input data. In the context of LSTMs, this is typically referred to as the number of time steps that the LSTM looks back, which is indirectly controlled by the following:
  1. Sequence length: When preparing your data for training, you decide on the sequence length, i.e. the number of past time steps the LSTM considers for each input sequence. This is the direct way you control how much "memory" the LSTM has, since it can only learn from the data within that window.
  2. Statefulness: In MATLAB, you can specify whether an LSTM is stateful or stateless. A stateful LSTM retains its state (cell state and hidden state) across batches, allowing it to maintain memory over sequences longer than the sequence length. A stateless LSTM resets its state after each batch.
  3. Batch size: The batch size also affects how the LSTM learns temporal dependencies. Smaller batches lead to more frequent weight updates, which can capture finer temporal detail.
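As a concrete illustration of the first point, here is a minimal sketch of how the look-back window is fixed when preparing rainfall-runoff data. The variable names (rain, runoff, seqLen) and the synthetic data are hypothetical, not from the original post; the layer functions are from the Deep Learning Toolbox.

```matlab
% Sketch: the "look-back" of the LSTM is set by the sequence length you
% choose when windowing the data, not by a dedicated memory parameter.
rain   = rand(1000, 1);   % hypothetical daily rainfall series
runoff = rand(1000, 1);   % hypothetical daily runoff series
seqLen = 30;              % the LSTM sees 30 past time steps per sample

numObs = numel(rain) - seqLen;
X = cell(numObs, 1);
Y = zeros(numObs, 1);
for i = 1:numObs
    % each predictor is a 1-by-seqLen window of past rainfall
    X{i} = rain(i:i+seqLen-1)';
    % the response is the runoff at the next time step
    Y(i) = runoff(i + seqLen);
end

layers = [
    sequenceInputLayer(1)
    lstmLayer(64, 'OutputMode', 'last')  % hidden/cell state carry the memory
    fullyConnectedLayer(1)
    regressionLayer];
% net = trainNetwork(X, Y, layers, trainingOptions('adam'));
```

Increasing seqLen is how you give the network a longer memory horizon; the number of hidden units (64 here) controls capacity, not look-back length.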
MATLAB's Deep Learning Toolbox provides a lot of flexibility and allows you to define custom layers and functionalities. To implement an EA-LSTM, you would need to:
  1. Ensure you have a thorough understanding of the EA-LSTM architecture and how it differs from a standard LSTM.
  2. Create custom layers if the EA-LSTM requires mechanisms that are not available in the standard LSTM layer provided by MATLAB.
  3. Implement the attention mechanism that is a part of the EA-LSTM. This may involve creating a custom attention layer that can be integrated with the LSTM.
  4. Define the training process, loss functions, and any other specifics required for the EA-LSTM.
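To make step 3 concrete, below is a minimal skeleton of a custom attention layer that reweights input features before they reach the LSTM, following MATLAB's custom-layer template. The class name, property names, and the simple softmax-over-features scheme are illustrative assumptions, not the actual EA-LSTM definition (a full EA-LSTM also evolves the attention weights with an evolutionary algorithm).

```matlab
% Hypothetical sketch of an input-attention layer (not the official EA-LSTM).
classdef attentionLayer < nnet.layer.Layer
    properties (Learnable)
        W   % learnable score per input feature
    end
    methods
        function layer = attentionLayer(numFeatures, name)
            layer.Name = name;
            layer.W = 0.01 * randn(numFeatures, 1);
        end
        function Z = predict(layer, X)
            % X: numFeatures-by-... input. Softmax-normalize the scores so
            % the attention weights sum to 1, then reweight each feature.
            a = exp(layer.W) ./ sum(exp(layer.W));
            Z = X .* a;
        end
    end
end
```

Such a layer could be placed between sequenceInputLayer and lstmLayer in a layer array; the training loop, loss function, and any evolutionary update of the attention weights (step 4) would still need to be defined separately.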

Categories

More about Sequence and Numeric Feature Data Workflows in Help Center and File Exchange.

Version

R2021a
