Issue with the large memory required by a nonlinear optimizer
Yannis Stamatiou
on 18 Apr 2023
Commented: Yannis Stamatiou on 20 Apr 2023
Dear MATLAB community,
I tried to run the following optimization problem with a 2-dimensional optimization variable of size 150-by-150. For some reason, somewhere in the solution process MATLAB creates a matrix of size (150^2)-by-(150^2). I have been trying to resolve this for several days (with the different options shown in the comments), but I cannot understand why such a huge matrix is created. Is there, perhaps, another nonlinear optimizer in MATLAB that does not require such huge matrices? Any help on this issue would be very welcome.
With best wishes,
Yannis
a = 4;
b = 2.1;
c = 4;
x = optimvar('x',150,150);
prob = optimproblem;
prob.Objective = parameterfun(x,a,b,c);
%opts=optimoptions('fmincon','Algorithm','interior-point','SpecifyObjectiveGradient',true,'HessianFcn','objective');
%opts=optimoptions('quadprog','Algorithm','trust-region-reflective','Display','off');
opts = optimoptions('fminunc','Algorithm','trust-region');
opts.HessianApproximation = 'lbfgs';
opts.SpecifyObjectiveGradient = false;
x0.x = 0.5 * ones([150,150]);
%[sol,qfval,qexitflag,qoutput] = solve(prob,x0,'options',opts);
[sol,fval] = solve(prob,x0)
3 comments
Torsten
on 20 Apr 2023
Moved: John D'Errico on 20 Apr 2023
And what did you decide to do?
Accepted Answer
Alan Weiss
on 19 Apr 2023
You have 150^2 = 22,500 optimization variables, so a dense Hessian (or a dense approximation of it) is a 22,500-by-22,500 matrix, which is the size you are seeing. I do not see your parameterfun function, but if it is not a supported function for automatic differentiation, then fminunc cannot use the 'trust-region' algorithm, because that algorithm requires a gradient function. The LBFGS Hessian approximation is not supported in the 'quasi-newton' algorithm. Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
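A minimal sketch of the fix Alan describes, assuming a hypothetical smooth objective in place of parameterfun (which is not shown in the question): return the gradient as a second output and set SpecifyObjectiveGradient, so that fminunc can run the 'trust-region' algorithm without relying on automatic differentiation.
function gradientDemo
% Solver-based version of the problem with an analytic gradient.
a = 4; b = 2.1; c = 4;
x0 = 0.5*ones(150*150,1);            % flattened 150-by-150 variable
opts = optimoptions('fminunc', ...
    'Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true);
[xsol,fval] = fminunc(@(x) objWithGrad(x,a,b,c), x0, opts)
end
function [f,g] = objWithGrad(x,a,b,c)
% Hypothetical objective; replace with the actual parameterfun math.
f = a*(x.'*x) + b*sum(cos(x)) + c;
if nargout > 1                       % solver requested the gradient
    g = 2*a*x - b*sin(x);
end
end
Note that with this many variables the trust-region algorithm may still finite-difference a dense Hessian by default; if the true Hessian is sparse, supplying a sparse 'HessPattern' (or a 'HessianMultiplyFcn') keeps the memory usage down.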
6 comments
Bruno Luong
on 20 Apr 2023
@Yannis Stamatiou " I cannot figure out exactly why"
The L-BFGS formula approximates the inverse of the Hessian by a low-rank update and does not require storing the full Hessian or its inverse.
That is why the memory requirement is reduced and the method is suitable for large-scale problems.
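To make the memory argument concrete, here is a sketch of the standard L-BFGS two-loop recursion (a textbook formulation, not MATLAB's internal implementation). It applies the inverse-Hessian approximation to the current gradient using only the last m step and gradient-difference pairs, so storage is O(m*n) instead of the O(n^2) of a dense Hessian; for n = 22500 and m = 10 that is a few megabytes instead of roughly 4 GB.
function d = lbfgsDirection(g, S, Y)
% g : current gradient, n-by-1
% S : n-by-m recent steps, S(:,k) = x_{k+1} - x_k (oldest first)
% Y : n-by-m gradient differences, Y(:,k) = g_{k+1} - g_k
m     = size(S,2);
alpha = zeros(m,1);
rho   = 1 ./ sum(S.*Y,1).';          % rho_k = 1/(y_k'*s_k)
q = g;
for k = m:-1:1                       % first loop: newest pair to oldest
    alpha(k) = rho(k) * (S(:,k).' * q);
    q = q - alpha(k) * Y(:,k);
end
gamma = (S(:,m).'*Y(:,m)) / (Y(:,m).'*Y(:,m));   % scaling for H0 = gamma*I
r = gamma * q;
for k = 1:m                          % second loop: oldest pair to newest
    beta = rho(k) * (Y(:,k).' * r);
    r = r + (alpha(k) - beta) * S(:,k);
end
d = -r;                              % search direction d = -H*g
end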
More Answers (0)