Adding Dropout to narxnet

Andriy Artemyev on 23 Mar 2021
Commented: Andriy Artemyev on 30 Mar 2021
Greetings! I wanted to ask if/how it is possible to add a dropout layer to a narxnet to improve regularization. Unfortunately I could not find any information elsewhere.
I have a narxnet that uses the last 3 lags of a time series and an exogenous input to forecast the next timestep, and I would like to introduce regularization measures to help with overfitting. Thanks in advance! My current code looks as follows:
forecast_horizon = 1;                                 % predict one step ahead
neurons = [5 5];                                      % two hidden layers, 5 neurons each
delays = 3;                                           % use the last 3 lags
inputDelays = (1:delays);                             % delays on the exogenous input
feedbackDelays = (1:delays);                          % delays on the fed-back target
net = narxnet(inputDelays, feedbackDelays, neurons);
net.trainFcn = 'trainbr';                             % Bayesian regularization training
net.trainParam.epochs = 40;
net = removedelay(net, forecast_horizon);             % shift taps so the network outputs the next step
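
For context, a minimal sketch of how this network would typically be prepared and trained, assuming x (the exogenous input series) and t (the target series) are 1-by-N cell arrays of scalar values as the shallow time-series toolbox expects (hypothetical variable names):

[X, Xi, Ai, T] = preparets(net, x, {}, t);  % build shifted inputs and initial delay states
net = train(net, X, T, Xi, Ai);             % train with Bayesian regularization (trainbr)
yPred = net(X, Xi, Ai);                     % one-step-ahead forecasts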

Accepted Answer

Shashank Gupta on 29 Mar 2021
Hi Andriy,
There are some workarounds to add dropout in narxnet: you can add dropout by defining a custom transfer function for one of the layers. Details on how to create a custom transfer function are shown here in the link.
Another convenient way is to not use a shallow network but go for a deep network. There are some resources you can check; try this, and once you implement it you can simply add a dropout layer. I would prefer the deep-network way. It is easier, more reliable, and more convenient to implement.
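As a minimal sketch of the deep-network route, the two 5-neuron hidden layers above could be reproduced with fullyConnectedLayer plus tanhLayer and interleaved with dropoutLayer. This assumes the 3 lags of the target and of the exogenous input have already been arranged into a feature matrix; XTrain, YTrain, the number of features, and the 0.2 dropout probability are illustrative assumptions, not part of the original post:

% XTrain: numObservations-by-numFeatures matrix of lagged inputs (hypothetical)
% YTrain: numObservations-by-1 vector of next-step targets (hypothetical)
numFeatures = 6;               % e.g. 3 lags of the target plus 3 lags of the exogenous input
layers = [
    featureInputLayer(numFeatures)
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)          % dropout for regularization
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', 'MaxEpochs', 40, 'Verbose', false);
net = trainNetwork(XTrain, YTrain, layers, options);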
I hope this helps.
Cheers.

More Answers (0)

Version

R2021a
