Discover MakerZone

MATLAB and Simulink resources for Arduino, LEGO, and Raspberry Pi

Learn more

Discover what MATLAB® can do for your career.

Opportunities for recent engineering grads.

Apply Today

Neural Network - Multi Step Ahead Prediction

Asked by Jack on 2 Sep 2011
Latest activity Commented on by Constantine on 21 Nov 2014

Hi all, please I need your help !

I've read all the posts here about Time Series Forecasting but still can't figure it out ! I'm drained.. :-(

I've a NARX neural network with 10 hidden neurons and 2 delays. As input I have a 510x5 (called Inputx) and as output I have a 510x1 (called Target).

I want to forecast 10 days ahead but it's really not working...

I tried the following code but I'm stuck now. :-(

Would you mind to help me ? Some code will be awesome. :-(

***////////////////////////////////////////////******** ***/////////////////////////////////////////// ******

inputSeries = tonndata(Inputx,false,false);
targetSeries = tonndata(Target,false,false);
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);

***////////////////////////////////////////////******** ***/////////////////////////////////////////// ******

2 Comments

Oleg Komarov on 2 Sep 2011

Two things,please change the title of your post in something useful and format the code: http://www.mathworks.com/matlabcentral/answers/13205-tutorial-how-to-format-your-question-with-markup#answer_18099

Constantine on 21 Nov 2014

with respect to the accepted answer by Lucas Garcia, I find the predicted data only agrees with the actual data as well as his every once in a while.

1. It's important, before running the fit, to clear the variables, eg. 'clear all.' Re-running without clearing the variables leads to much worse fits.

2. much better fits result from using a bigger delay, like 5, instead of the delay of 2 in his example. Or by adding additional training data, such as the time derivative or 2nd time derivatives of the training data. Of course, doing this makes the fit considerably slower.

Jack

4 Answers

Answer by Lucas García on 7 Sep 2011
Accepted answer

Hi Jack,

When using narxnet, the network performs only a one-step ahead prediction after it has been trained. Therefore, you need to use closeloop to perform a multi-step-ahead prediction and turn the network into parallel configuration.

Take a look at this example for a multi-step-ahead prediction, N steps. This uses the dataset magdata.mat which is available in the Neural Network Toolbox. Also, some of the inputs will be used for performing the multi-step-ahead prediction, and results validated with the original data. I hope the comments help to understand.

%% 1. Importing data

S = load('magdata');
X = con2seq(S.u);
T = con2seq(S.y);

%% 2. Data preparation

N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries  = X(1:end-N);
targetSeries = T(1:end-N);
% 2nd group: this is the new data used for simulation. inputSeriesVal will 
% be used for predicting new targets. targetSeriesVal will be used for
% network validation after prediction
inputSeriesVal  = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available

%% 3. Network Architecture

delay = 2;
neuronsHiddenLayer = 50;
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);

%% 4. Training the network

[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries); 
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai); 
% Performance for the series-parallel implementation, only 
% one-step-ahead prediction
perf = perform(net,Ts,Y);

%% 5. Multi-step ahead prediction

inputSeriesPred  = [inputSeries(end-delay+1:end),inputSeriesVal];
targetSeriesPred = [targetSeries(end-delay+1:end), con2seq(nan(1,N))];
netc = closeloop(net);
view(netc)
[Xs,Xi,Ai,Ts] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xs,Xi,Ai);
perf = perform(net,yPred,targetSeriesVal);
figure;
plot([cell2mat(targetSeries),nan(1,N);
      nan(1,length(targetSeries)),cell2mat(yPred);
      nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')

9 Comments

Nilanjan on 25 Apr 2013

Hi, i am using NARX to predict a daily stock market index data (Sensex 2003x1 matrix) as target and another daily stock market data(Nifty) as input. I have done it using the example you have shown in:

http://www.mathworks.in/matlabcentral/answers/14970

The code:

%%%newNARX code 24/4/2013
%% 1. Importing data
% Matrix of 2003x1 each are
% daily stock market indices data
% of Nifty & Sensex
load Nifty.dat;
load Sensex.dat;
% %%S = load('magdata');
% %%X = con2seq(S.u);
% %%T = con2seq(S.y);
% To scale the data it is converted to its log value:
  lognifty = log(Nifty);
  logsensex = log(Sensex);
    X = tonndata(lognifty,false,false);
    T = tonndata(logsensex,false,false);
  %  X = con2seq(x);
  %  T = con2seq(t);
%% 2. Data preparation
N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries  = X(1:end-N);
targetSeries = T(1:end-N);
% 2nd group: this is the new data used for simulation. inputSeriesVal will 
% be used for predicting new targets. targetSeriesVal will be used for
% network validation after prediction
inputSeriesVal  = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available
%% 3. Network Architecture
delay = 2;
neuronsHiddenLayer = 50;
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
%% 4. Training the network
[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries); 
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai); 
% Performance for the series-parallel implementation, only 
% one-step-ahead prediction
perf = perform(net,Ts,Y);
%% 5. Multi-step ahead prediction
inputSeriesPred  = [inputSeries(end-delay+1:end),inputSeriesVal];
targetSeriesPred = [targetSeries(end-delay+1:end), con2seq(nan(1,N))];
netc = closeloop(net);
view(netc)
[Xs,Xi,Ai,Ts] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xs,Xi,Ai);
perf = perform(net,yPred,targetSeriesVal);
figure;
plot([cell2mat(targetSeries),nan(1,N);
      nan(1,length(targetSeries)),cell2mat(yPred);
      nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')

Network predictions are coming very bad.. I guess there is some problem with the close loop's initial input states and initial layer states. please help.

phuong on 28 Aug 2013

Please i don't understand some idea about your answer of Lucas García. Anyone know please show me. First, code of Lucas that use to predict N values in future with one step: predict value at t+1 by past values at t-1, t-2,...,t-d. Is it right? Second, why we don't use a loop as "for" with i from 1 to N. and each i, we will predict value at t+i by past values at t+i-1,...t+i-d. I think It's more accuracy than first way.

Greg Heath on 30 Aug 2013

No need for a loop. The input and targets are time series, not single points from time series. Each step is performed automatically via sim or net.

When the loop is called,there is only the input series. The target series is replaced by output feedback.

Greg

Lucas García
Answer by Mark Hudson Beale on 9 Sep 2011

Here is an example that may help. A NARX network is trained on series inputs X and targets T, then the simulation is picked up at the end of X using continuation input data X2 with a closed loop network. The final states after open loop simulation with X are used as the initial states for closed loop simulation with X2.

% DESIGN NETWORK
[x,t] = simplenarx_dataset;
net = narxnet;
[X,Xi,Ai,T] = preparets(net,x,{},t);
net = train(net,X,T,Xi,Ai);
view(net)
% SIMULATE NETWORK FOR ORIGINAL SERIES
[Y,Xf,Af] = sim(net,X,Xi,Ai);
% CONTINUE SIMULATION FROM FINAL STATES XF & AF WITH ADDITIONAL
% INPUT DATA USING CLOSED LOOP NETWORK.
% Closed Loop Network
netc = closeloop(net);
view(netc)
% 10 More Steps for the first (now only) input
X2 = num2cell(rand(1,10));
% Initial input states for closed loop continuation will be the
% first input's final states.
Xi2 = Xf(1,:);
% Initial 2nd layer states for closed loop contination will be the
% processed second input's final states.  Initial 1st layer states
% will be zeros, as they have no delays associated with them.
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{1}.processFcns)
  fcn = net.inputs{i}.processFcns{i};
  settings = net.inputs{i}.processSettings{i};
  Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
% Closed loop simulation on X2 continues from open loop state after X.
Y2 = sim(netc,X2,Xi2,Ai2);

2 Comments

Jack on 12 Sep 2011

Thank you very much Mark for your answer ! :-))

Elma on 18 Feb 2014

I have tried this code, and it is great, but when I try to apply it for my problem, I get really bad results. I tried with changing input and feedback delays, as well as number of hidden neurons, but the results are always bad (figure) (green line is multi step predistion)

The code is given below:

% DESIGN NETWORK
ID=1:2;
HL=6
FD=1:2;
net = narxnet(ID,FD,HL);
[X,Xi,Ai,T] = preparets(net,x,{},WS);
net.divideFcn = 'divideblock';  
net = train(net,X,T,Xi,Ai);
% SIMULATE NETWORK FOR ORIGINAL SERIES
[Y,Xf,Af] = sim(net,X,Xi,Ai);
% CONTINUE SIMULATION FROM FINAL STATES XF & AF WITH ADDITIONAL
% INPUT DATA USING CLOSED LOOP NETWORK.
% Closed Loop Network
netc = closeloop(net);
Xi2 = Xf(1,:);
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{1}.processFcns)
  fcn = net.inputs{i}.processFcns{i};
  settings = net.inputs{i}.processSettings{i};
  Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
Y2 = sim(netc,X2,Xi2,Ai2);
plot(1:length(WS),cell2mat(WS))
hold on
plot(1:length(Y),cell2mat(Y),'r')
plot(length(WS):length(WS)+length(Y2)-1,cell2mat(Y2),'g')
legend('Input data - target series','One-step ahead prediction','Multi-step prediction beyond target series');
Mark Hudson Beale
Answer by Greg Heath on 25 Mar 2014

When the loop is closed, the net should be retrained with the original data and initial weights the same as the final weights of the openloop configuration.

0 Comments

Greg Heath
Answer by mladen on 25 Oct 2013

Be aware that predicting outputs this way (similar to cascade relaization of linear system) has great sensitivity to parametar estimation errors because they propagate in the process Mark Hudson Beale mentioned. This is highlighted in hard, multiple steps ahead problems.

Parallel realizations (simoltanoius output estimation...for instance 10 outputs of neural network for next 10 time steps) tend to be less sensitive to this errors. I have implemented this with my code which is alway prone to error :) So my subquestion is:

Is there some specific way to prepare my data for training with some matlab function?

0 Comments

mladen

Contact us