Hyperparameter tuning of a neural network

3 views (last 30 days)
Ali
Ali on 18 Oct 2019
Answered: Sai Bhargav Avula on 23 Oct 2019
How do I tune the parameters below in a neural network?
training_function = {'traingd' 'traingda' 'traingdm' 'traingdx'}
optimizers= {'SGD', 'RMSprop', 'Adam'}
activation_functions= {'ReLU','Dropout'};
Transfer_functions= {'tansig','tanh'};
I am trying Bayesian optimization with 'bayesopt()' using 'optimizableVariable', but it didn't work for me. How do I define the above parameters in the "vars" variable?
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define the hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% Optimize
minfn = @(T) kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars, 'IsObjectiveDeterministic', false, ...
    'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)

Answers (1)

Sai Bhargav Avula on 23 Oct 2019
Hi,
You cannot directly optimize over the parameters you mentioned using Bayesian optimization.
A possible workaround is to define a custom objective function that takes the given parameters as input and to sweep over them sequentially.
For example
function rmse = optimizerLoss(x, y, cv, numHid, optimizer, lr)
% Train a feedforward net with the given hidden layer size, training
% function (e.g. 'traingd'), and learning rate.
net = feedforwardnet(numHid, optimizer);
net.trainParam.lr = lr;
net = train(net, x(:, cv.training), y(:, cv.training));
% Evaluate on the validation set and compute the RMSE
ypred = net(x(:, cv.test));
rmse = sqrt(mean((ypred - y(:, cv.test)).^2));
end
Call this with each of the training functions you mentioned in turn. And finally, Dropout is not an activation function.
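As a rough sketch of the sequential sweep described above (assuming XTrain, YTrain, cv, and vars are defined as in the question, and optimizerLoss is the function above), you could run one bayesopt search per training function and keep the best result:

```matlab
% Sweep the training functions one at a time; for each, let bayesopt
% tune the numeric hyperparameters (hiddenLayerSize, lr).
trainFcns = {'traingd', 'traingda', 'traingdm', 'traingdx'};
bestLoss = Inf;
for k = 1:numel(trainFcns)
    objFcn = @(T) optimizerLoss(XTrain', YTrain', cv, ...
        T.hiddenLayerSize, trainFcns{k}, T.lr);
    results = bayesopt(objFcn, vars, 'IsObjectiveDeterministic', false, ...
        'AcquisitionFunctionName', 'expected-improvement-plus');
    % Keep the training function whose best objective value is lowest
    if results.MinObjective < bestLoss
        bestLoss = results.MinObjective;
        bestFcn  = trainFcns{k};
        bestT    = bestPoint(results);
    end
end
```

This treats the training function as an outer loop rather than a bayesopt variable; each inner search is independent, so you pay the full optimization cost once per training function.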
