How to do a production forecast with GA-NN hybrid in MATLAB

I am working on an undergraduate thesis that requires me to use a hybrid of a genetic algorithm (GA) and a neural network (NN) to forecast future production values. I have gone through several forums and documentation pages, but I cannot seem to implement the GA-NN hybrid correctly.
I would like to use the GA to optimize the NN weights and biases. Using the NN Time Series toolbox, I found a network with 20 hidden neurons and 4 time delays to be a good fit (I used NAR rather than NARX... hope that's OK?).
My data consists of three parameters over about 2100 timesteps. I want to predict their future values with the NN and optimize the weights and biases with the GA.

Accepted Answer

Greg Heath on 9 May 2015
You have a single I = O = 3 dimensional series with N = 2100 timesteps. You have successfully designed a NARNET with NFD = 4 feedback delays and H = 20 hidden neurons. This results in
Nw = net.numWeightElements = (NFD*O+1)*H+(H+1)*O = 323
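The quoted weight count can be verified with a line or two of plain MATLAB arithmetic (variable names here are just for the check; for a trained net, compare against net.numWeightElements):

```matlab
% Quick check of the weight count quoted above
% (O = 3 outputs, NFD = 4 feedback delays, H = 20 hidden neurons):
O = 3; NFD = 4; H = 20;
Nw = (NFD*O + 1)*H + (H + 1)*O   % (13)(20) + (21)(3) = 260 + 63 = 323
```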
Now you wish to design a net using GA. Is that correct?
Are you able to obtain acceptable designs with H < 20? The fewer, the better.
The NNToolbox does not have a genetic algorithm. However, there are posts that deal with GA designs for a non-timeseries net. Try searching the NEWSGROUP and ANSWERS using
neural genetic
or
neural ga
Hope this helps.
Greg
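As a hedged sketch of the kind of GA-over-weights design discussed above (this is not from the thread: it assumes the separate Global Optimization Toolbox's ga function, reuses net, x, xi, ai, t from a standard preparets/train workflow, and the helper name gaFitness is hypothetical):

```matlab
% Sketch: optimize a NARNET's weight/bias vector with GA.
% Assumes net, x, xi, ai, t already exist from preparets/train.
wb0 = getwb(net);                 % current weights and biases as one vector
Nw  = numel(wb0);
fitfun = @(wb) gaFitness(wb, net, x, xi, ai, t);
opts = optimoptions('ga', 'MaxGenerations', 100, 'PopulationSize', 50, ...
                    'InitialPopulationMatrix', wb0');   % seed GA at the trained net
wbBest = ga(fitfun, Nw, [], [], [], [], [], [], [], opts);
net = setwb(net, wbBest);         % install the GA-optimized weights

% Fitness helper (as a separate file or local function):
function perf = gaFitness(wb, net, x, xi, ai, t)
    net  = setwb(net, wb);        % load candidate weights into the network
    y    = net(x, xi, ai);        % open-loop simulation
    perf = perform(net, t, y);    % MSE performance to minimize
end
```

Seeding the initial population with the backprop-trained weight vector (rather than a purely random population) usually gives the GA a much better starting point.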
  3 comments
Greg Heath on 11 May 2015
Edited: Greg Heath on 11 May 2015
TO DUPLICATE PREVIOUS RUNS YOU HAVE TO INITIALIZE THE RNG TO THE SAME INITIAL VALUE. FOR EXAMPLES SEARCH THE NEWSGROUP AND ANSWERS USING ONE OR MORE OF
GREG RNG(0)
GREG RNG(4151941)
GREG RNG('DEFAULT')
INSTEAD OF THE HISTOGRAM, I JUST LOOK AT THE NORMALIZED MEAN-SQUARE-ERROR AND ASSOCIATED RSQUARE (Fraction of target variance modeled by the net)
NMSE = mse(target-output)/mean(var(target',1))
R2 = 1-NMSE
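A toy illustration (hypothetical data) of that normalization for a 3-signal target stored as a 3 x N matrix; the plain-MATLAB mean of squared errors below is equivalent to the mse(...) form above:

```matlab
target = randn(3, 100);
output = target + 0.1*randn(3, 100);    % small errors ==> R2 close to 1
NMSE = mean(mean((target - output).^2)) / mean(var(target', 1));
R2   = 1 - NMSE                         % fraction of target variance modeled
```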
Hope this helps
Greg
Olumide Oladoyin on 12 May 2015
OK... sounds good. Here is what my NN script looks like:
% PInput - feedback time series.
T = tonndata(PInput,false,false);
% Select Training Function
trainFcn = 'trainlm'; % Levenberg-Marquardt
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:3;
hiddenLayerSize = 9;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,{},{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every timestep
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
% view(net)
err = cell2mat(gsubtract(t,y)); % t and y are cell arrays; convert to matrices
NMSE = mse(err)/mean(var(cell2mat(t)',1))
R2 = 1-NMSE
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,{},{},T);
yc = netc(xc,xic,aic);
perfc = perform(netc,tc,yc)
% Multi-step Prediction
% Sometimes it is useful to simulate a network in open-loop form for as
% long as there is known data T, and then switch to closed-loop to perform
% multistep prediction. Here The open-loop network is simulated on the known
% output series, then the network and its final delay states are converted
% to closed-loop form to produce predictions for 100 more timesteps.
[x1,xio,aio,t] = preparets(net,{},{},T);
[y1,xfo,afo] = net(x1,xio,aio);
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(cell(0,100),xic,aic);
% Further predictions can be made by continuing simulation starting with
% the final input and layer delay states, xfc and afc.
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1. The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
[xs,xis,ais,ts] = preparets(nets,{},{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
%figure, plotresponse(ts,ys)
Any suggestions to improve the neural network? And could you please point me toward getting future values up to about 1000 timesteps ahead?
Thanks in anticipation.
