fminunc in an endless loop
Hi all,
I would really appreciate some help. I have written a neural network model and tried to keep it as flexible as possible. The model works perfectly (tested against test cases from several online courses) for binary classification, both with and without the optimization function fminunc. It also works (and is tested) for multi-class classification as long as I do NOT use fminunc. With fminunc, however, the cost I am trying to minimize stays stuck at its first value and the whole program runs in an endless loop. I have also tried feeding fminunc the flattened gradient in addition to the parameters (the sought-after output), but that does not help either: the cost stays at the initial value.
Any ideas what I am doing wrong?
Here is my code:
%
% Reshape data so that examples are in columns, features in rows
X = X';
y = double( one_hot(y'));
%
% Set up layer dimensions
layer_dims = [400,25,10];
activ_hidden = 'sigmoid'; % Sigmoid used in test case
activ_out = 'sigmoid'; % Sigmoid used in test case
maxIter = 50; % 50 used in test case
lambda = 0; % No L2-regularization in test case
keep_prob = 1; % No dropout-regularization in test case
learning_rate = 1; % Required model argument but not needed in fminunc, hence set to 1
%
% Set up initial parameters with seed from test case
parameters = rand_u_init(layer_dims);
%
% Reshape parameters
parameters_flat = flatten_params(parameters, layer_dims);
%
% Set up options for fminunc
options = optimset('MaxIter', maxIter);
%
% Create short hand for the cost function to be minimized
cost_function = @(p) L_layer_multi_optimized_model_for_testing(...
p, ...
X,...
y, ...
layer_dims, ...
activ_hidden, ...
activ_out, ...
lambda, ...
keep_prob, ...
learning_rate);
%
% Run optimization algorithm fminunc
[parameters_flat, cost] = ...
fminunc(cost_function, parameters_flat, options);
%
% Reshape parameters
parameters = reshape_params(parameters_flat, layer_dims);
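One thing worth checking: for fminunc to actually use a user-supplied gradient, the objective must return the gradient as a second output AND the options must say so; otherwise fminunc silently falls back to finite-difference gradients over all parameters, which is slow and can leave the cost looking stuck. A minimal sketch, assuming the model function can return the flattened gradient as its second output (that signature is an assumption about your code):

```matlab
% Objective returns [cost, flattened gradient] (assumed signature).
cost_function = @(p) L_layer_multi_optimized_model_for_testing(...
    p, X, y, layer_dims, activ_hidden, activ_out, ...
    lambda, keep_prob, learning_rate);
%
% Tell fminunc the objective supplies its own gradient.
% ('GradObj' is the classic optimset name; with optimoptions the
% equivalent is 'SpecifyObjectiveGradient'.)
options = optimset('MaxIter', maxIter, 'GradObj', 'on');
%
[parameters_flat, cost] = fminunc(cost_function, parameters_flat, options);
```

Without 'GradObj' set to 'on', supplying the gradient as a second output has no effect, which would match the symptom described above.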
As I say, the model runs perfectly well without fminunc, and I cannot find the mistake. Any help is really appreciated.
Thanks a lot in advance, cheers Wolfgang
Answers (1)
Alan Weiss
27 Oct 2017
While I do not really understand what you are trying to do, perhaps you need to set fminunc options to use larger finite differences than the defaults.
Alan Weiss
MATLAB mathematical toolbox documentation
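A hedged sketch of the suggestion above, using the classic optimset finite-difference options (the step sizes here are illustrative, not recommendations):

```matlab
% Widen the finite-difference perturbation steps so that flat or
% saturated regions of the cost surface still produce a nonzero
% numerical gradient estimate (values illustrative).
options = optimset('MaxIter', 50, ...
                   'DiffMinChange', 1e-4, ...
                   'DiffMaxChange', 1e-1);
```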
1 comment
Wolfgang Reuter
28 Oct 2017