Out of Memory errors in MATLAB NN Toolbox and data division problems in training datasets.
Hello,
I am having trouble training a very large dataset with the MATLAB NN Toolbox. When I use the trainlm algorithm, training fails with an "Out of memory" error, but the other training algorithms run without any memory problem. Why is this? Also, as soon as I use more than 15 hidden neurons it runs out of memory as well. How can I solve this kind of problem?
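From what I understand, trainlm builds the full Jacobian matrix (number of training samples by number of weights), which would explain why it runs out of memory while the gradient-only algorithms do not, and why adding hidden neurons makes it worse. Below is a minimal sketch of the settings I am aware of for reducing this, assuming they exist in my toolbox version (older releases use net.trainParam.mem_reduc instead of net.efficiency.memoryReduction):
% Sketch: two ways to reduce trainlm's memory footprint (assumes a fitnet/trainlm setup)
net = fitnet(10,'trainlm');
net.efficiency.memoryReduction = 4;  % compute the Jacobian in 4 pieces: slower, but less RAM
% Alternative: use a training function that never forms the full Jacobian
net.trainFcn = 'trainscg';           % scaled conjugate gradient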
One more thing: I set the data division to 10% training, 45% validation and 45% testing, but after running the code the workspace shows that about 25% of the samples were used for training, 37% for validation and 37% for testing. How can I resolve this?
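For reference, a minimal sketch of one way to check which division train() actually applied, using the standard index fields of the training record tr:
% Sketch: inspect the split that train() actually used
[net,tr] = train(net,inputs,targets);
N = size(targets,2);
fprintf('train %.0f%%, val %.0f%%, test %.0f%%\n', ...
    100*numel(tr.trainInd)/N, 100*numel(tr.valInd)/N, 100*numel(tr.testInd)/N);
disp(net.divideFcn);    % division function in effect
disp(net.divideParam);  % ratios stored on the network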
Does anybody have an idea how to solve these problems? I would be glad to receive any comments or suggestions. Thanks.
Here is the code I used to train the dataset:
% Read each input/target column from the spreadsheet
EX_355  = xlsread('Training Dataset.xlsx','B2:B435106');
EX_532  = xlsread('Training Dataset.xlsx','C2:C435106');
BA_355  = xlsread('Training Dataset.xlsx','D2:D435106');
BA_532  = xlsread('Training Dataset.xlsx','E2:E435106');
BA_1064 = xlsread('Training Dataset.xlsx','F2:F435106');
Reff    = xlsread('Training Dataset.xlsx','G2:G435106');
% Arrange the inputs as a 5-by-N matrix and the target as a 1-by-N row vector
Input(1,:) = EX_355;
Input(2,:) = EX_532;
Input(3,:) = BA_355;
Input(4,:) = BA_532;
Input(5,:) = BA_1064;
Target(1,:) = Reff;
% Note: fitnet below creates a new network, so the feedforwardnet/configure/init
% calls here are effectively discarded
net = feedforwardnet;
net = configure(net,Input,Target);
net = init(net);
inputs = Input;
targets = Target;
% Fitting network with 10 hidden neurons
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Random data division: request 10% training, 45% validation, 45% testing
net.divideFcn = 'dividerand';
net.divideMode = 'sample';
net.divideParam.trainRatio = 10/100;
net.divideParam.valRatio = 45/100;
net.divideParam.testRatio = 45/100;
net.trainFcn = 'trainlm';
net.performFcn = 'mse';
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotfit'};
[net,tr] = train(net,inputs,targets);
% Evaluate the trained network on the full dataset
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Targets restricted to each subset via the division masks in the training record
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
% The lines below only reference the default training parameters; with the
% trailing semicolons they neither change nor display anything
net.trainParam.epochs;
net.trainParam.time;
net.trainParam.goal;
net.trainParam.min_grad;
net.trainParam.mu_max;
net.trainParam.max_fail;
net.trainParam.show;
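As a side note, a sketch of how the same data could be loaded with a single spreadsheet read instead of six separate xlsread calls, assuming columns B to G of 'Training Dataset.xlsx' hold EX_355, EX_532, BA_355, BA_532, BA_1064 and Reff in that order:
% Sketch: read the whole block once and transpose into NN Toolbox orientation
data    = xlsread('Training Dataset.xlsx','B2:G435106');  % N-by-6 numeric matrix
inputs  = data(:,1:5)';   % 5-by-N input matrix (one row per variable)
targets = data(:,6)';     % 1-by-N target vector (Reff)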