Setting up LSTM time series prediction

Bastien Wioland on 19 Apr 2021
Answered: Asvin Kumar on 11 May 2021
Hello,
I am currently working on implementing time series prediction using LSTM networks. I understand the "chicken pox" example provided by MathWorks well, but something remains unclear to me.
Let's take the following example: I want to predict the following time series: [0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15].
Thus I would set XTrain = [0 1 2 3 4 5 6 7 8 9 10 11] and YTrain = [1 2 3 4 5 6 7 8 9 10 11 12].
In this case, when the network predicts a value, for example 12, does it take into account several inputs (for example [9 10 11]), or only the corresponding value in the XTrain set, i.e. 11?
Consequently, for a future prediction (not training), does the size of the XTest value (inside predictAndUpdateState(net,XTest,'ExecutionEnvironment','cpu')) have an impact on the prediction, or is it better to set the number of inputs for each prediction manually, for example:
XTrain = [0 1 2 3; 1 2 3 4; 2 3 4 5; ...] and YTrain = [4 5 6 ...]?
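For concreteness, a rough sketch of the two framings I have in mind (the variable names are just illustrative):
% Option A: one long sequence, targets shifted by one time step (as in the chicken pox example)
XTrain = 0:11;    % inputs  [0 1 2 ... 11]
YTrain = 1:12;    % targets [1 2 3 ... 12]
% Option B: explicit sliding windows, each observation holding several past values
XTrainWin = [0 1 2 3; 1 2 3 4; 2 3 4 5];   % each row = 4 consecutive past values
YTrainWin = [4; 5; 6];                     % next value for each window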
Thank you very much for your help!

Answers (1)

Asvin Kumar on 11 May 2021
Have a look at the Classification, Prediction, and Forecasting section from this page on LSTMs. As the page explains, you broadly have two cases:
  1. When you have several input sequences, each of the same or varying length, and you train your network on those.
  2. When you have one long input sequence and you train your network on a part of that to make multiple predictions.
The chicken pox example that you mentioned falls under category 2. The Japanese Vowels classification and Engine RUL Prediction examples would fall under category 1.
For the chicken pox example, we make multiple predictions on one long sequence of data. Every prediction updates the cell state and hidden state of the network. The hidden state is also the output to the next layer. At each step, the network takes 1 time step as input, and the LSTM layer produces a 200-element hidden-state vector as its output. This 200 is determined by the 'NumHiddenUnits' property of the lstmLayer. That's why, in the example's code, they predict over all the training data before starting prediction on the test data. By doing that, the network's cell and hidden states are ready for prediction on the test input.
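As a sketch of that pattern, along the lines of the chicken pox example (assuming a network with sequenceInputLayer(1), lstmLayer(200), fullyConnectedLayer(1), and regressionLayer that has already been trained on a standardized 1-by-T sequence XTrain with one-step-shifted targets YTrain, plus a test sequence XTest):
% Run over the training data first so the cell and hidden states reflect the observed past.
net = predictAndUpdateState(net,XTrain);
% Seed the forecast with the last known value ...
[net,YPred] = predictAndUpdateState(net,YTrain(end));
% ... then feed each prediction back in, one time step at a time.
numTimeStepsTest = numel(XTest);
for i = 2:numTimeStepsTest
    [net,YPred(:,i)] = predictAndUpdateState(net,YPred(:,i-1),'ExecutionEnvironment','cpu');
end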
Does it take into account numerous inputs, or only the corresponding value in the XTrain set?
Technically, it takes in only 1 input. But the output is influenced by the cell state and the hidden state from the previous time step, along with the current input. So the current step's output depends on inputs from all the time steps before it, even though you only pass in the input for the current time step.
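A quick way to see this state dependence (a sketch, assuming a trained recurrent network net with a single input feature):
x = 11;
net = resetState(net);                        % clear the cell and hidden state
[net,yCold] = predictAndUpdateState(net,x);   % prediction with no history
net = resetState(net);
net = predictAndUpdateState(net,0:10);        % replay the earlier time steps first
[net,yWarm] = predictAndUpdateState(net,x);   % prediction with history
% yCold and yWarm will generally differ, because the output depends on the
% state accumulated over previous time steps, not only on the current input.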
Does the size of the XTest value have an impact on the prediction?
The size of the XTest value is fixed when you configure the sequenceInputLayer's 'inputSize' argument. In the chicken pox example, you don't need to explicitly pass any values from the past to predict the future; that information is learnt by the LSTM during training. You only have to pass the single input for the current time step to get a prediction for the next time step.
It is, however, possible to design and train networks where you explicitly pass in past values for each prediction. This would involve some preprocessing.
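If you did want the second setup from your question, one possible preprocessing step (a hypothetical sketch; the window length k and variable names are my own) is to build explicit lag windows and treat each window as its own short sequence, i.e. category 1 above:
data = 0:15;
k = 4;                         % number of past values fed in per prediction
numObs = numel(data) - k;
XWin = cell(numObs,1);
YWin = zeros(numObs,1);
for i = 1:numObs
    XWin{i} = data(i:i+k-1);   % 1-by-k sequence of past values
    YWin(i) = data(i+k);       % value to predict
end
% Each cell of XWin is an independent short sequence, so the network sees the
% past values explicitly instead of relying only on its internal state.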


