Is there any gradient descent method available?

186 views (last 30 days)
Jan Valdman
Jan Valdman on 15 Oct 2018
Commented: Alan Weiss on 24 Nov 2021
We are working on the optimization of nonconvex energies in solid mechanics (see the attached picture) resulting from a finite element discretization with a moderate number of variables (up to several thousand).
At the moment I am using the function fminunc. It typically converges very slowly or not at all (for more variables), taking many iterations:
                                                      First-order
 Iteration  Func-count       f(x)       Step-size     optimality
     0           1781      0.4                          0.00625
     1           5343      0.39987      0.168734        0.03
     2           7124      0.399645     1               0.0235
     3           8905      0.398947     1               0.0616
     4          10686      0.398602     1               0.0458
     5          12467      0.398094     1               0.0459
     6          14248      0.397673     1               0.059
     7          16029      0.397249     1               0.0415
     8          17810      0.396635     1               0.0404
     9          19591      0.396317     1               0.0847
    10          21372      0.395881     1               0.0523

(some iterations omitted)

                                                      First-order
 Iteration  Func-count       f(x)       Step-size     optimality
    60         110422      0.310397     1               0.128
    61         112203      0.307966     1               0.121
    62         113984      0.305174     1               0.235
    63         117546      0.303857     0.241185        0.128
    64         121108      0.302965     0.1             0.118
    65         124670      0.301738     0.1             0.0921
    66         229749      0.301464     0.0188263       0.181

Local minimum possible.

fminunc stopped because it cannot decrease the objective function
along the current search direction.
We have no information on the gradient (a subgradient computation is possible, but theoretically demanding to derive; there are some nondifferentiable terms, etc.) and consequently none on the Hessian (both are evaluated numerically). We applied the setup
options = optimoptions('fminunc','Algorithm','quasi-newton','Display','iter');
My colleague and I think the quasi-Newton method is a bit of an overkill for our problem, and we would like to run only a few iterations of gradient descent (possibly with damping) to get a picture of what happens around our initial energy.
Is there any implementation of the method able to evaluate the numerical gradient, and maybe compare it with our theoretically derived gradient later? It would be perfect if fminunc had this option, but I could not find it.
Thank you, Jan Valdman

Accepted Answer

Alan Weiss
Alan Weiss on 15 Oct 2018
Indeed, fminunc has a mainly-undocumented gradient descent feature that you can see demonstrated in this example. Usually, gradient descent does not work very well, but I suppose that you already know that.
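A minimal sketch of how that feature can be requested, assuming the mostly-undocumented 'steepdesc' value of the HessUpdate option (which replaces the quasi-Newton update with steepest descent); the toy energy and variable names below are illustrative, not the poster's actual problem:

```matlab
% Sketch: ask fminunc's quasi-newton algorithm for steepest descent steps.
% 'HessUpdate','steepdesc' is the mostly-undocumented switch; MaxIterations
% limits the run to the few damped descent steps the poster wants to inspect.
fun = @(x) sum((x - 1).^2) + 0.1*sum(abs(x));  % toy nonsmooth energy (illustrative)
x0  = zeros(10, 1);

options = optimoptions('fminunc', ...
    'Algorithm', 'quasi-newton', ...
    'HessUpdate', 'steepdesc', ...  % gradient (steepest) descent direction
    'MaxIterations', 10, ...        % run only a few iterations
    'Display', 'iter');

[x, fval] = fminunc(fun, x0, options);
```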
To check whether the internally calculated gradients in fminunc match a gradient function at the initial point, you can use the CheckGradients option. If you want to get a numerical approximation to your gradients, you can use John D'Errico's File Exchange contribution Adaptive Robust Numerical Differentiation, though on second thought this might not be exactly suited to your problem.
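A short sketch of the CheckGradients mechanism, using the standard Rosenbrock test function in place of the poster's energy (the function and its hand-derived gradient here are illustrative only):

```matlab
% Sketch: compare a hand-derived gradient against fminunc's finite-difference
% gradient at the initial point. With CheckGradients=true, fminunc reports
% the discrepancy before starting, and errors out if it is too large.
x0 = [-1; 2];

options = optimoptions('fminunc', ...
    'Algorithm', 'quasi-newton', ...
    'SpecifyObjectiveGradient', true, ...  % use the gradient we supply
    'CheckGradients', true, ...            % compare it with finite differences at x0
    'FiniteDifferenceType', 'central');

x = fminunc(@rosenWithGrad, x0, options);

function [f, g] = rosenWithGrad(x)
% Rosenbrock function with its analytic gradient (classic test problem).
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1
    g = [-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
end
```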
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
  2 comments
Jan Valdman
Jan Valdman on 23 Nov 2021
Dear Alan,
thank you for your tips; we published our 2D computations as Global injectivity in second-gradient nonlinear elasticity and its approximation with penalty terms - Stefan Krömer, Jan Valdman, 2019 (sagepub.com). There was some initial hope for the trust-region method, but we abandoned it, since the gradient evaluations are very technical (there is a nonlocal part of our function). So we stuck with the quasi-Newton method instead and did not profit from steepest descent; however, it might be tested later as well.
Some new pictures (one is attached) are promising in our recent 3D computations, but the method still needs too many iterations. The difficult part of the function to be minimized becomes active only in later iterations, and we can in fact easily derive a full Hessian for the initial iteration. So I am wondering whether we can provide an initial Hessian matrix in the BFGS update to replace the standard choice of the identity matrix. We expect it would reduce the number of iterations. Is it possible?
Thank you, Jan Valdman
Alan Weiss
Alan Weiss on 24 Nov 2021
Indeed, if you can provide a gradient and Hessian in general (not just at the initial point), then you can expect your optimization to proceed more quickly and in fewer iterations. For an example, see https://www.mathworks.com/help/optim/ug/symbolic-math-toolbox-calculates-gradients-and-hessians.html#SymbolicMathToolboxCalculatesGradientsAndHessiansExample-10
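A minimal sketch of supplying both derivatives to fminunc's trust-region algorithm, where the objective returns f, gradient, and Hessian and 'HessianFcn','objective' tells the solver to use the third output (the quadratic test energy below is illustrative only):

```matlab
% Sketch: provide an analytic gradient and Hessian to the trust-region
% algorithm of fminunc via extra outputs of the objective function.
x0 = [-1; 2];

options = optimoptions('fminunc', ...
    'Algorithm', 'trust-region', ...
    'SpecifyObjectiveGradient', true, ... % second output is the gradient
    'HessianFcn', 'objective');           % third output is the Hessian

[x, fval] = fminunc(@quadWithDerivs, x0, options);

function [f, g, H] = quadWithDerivs(x)
% Simple smooth test energy with exact derivatives (illustrative only).
f = x(1)^2 + 3*x(2)^2 + x(1)*x(2);
if nargout > 1
    g = [2*x(1) + x(2); 6*x(2) + x(1)];
end
if nargout > 2
    H = [2 1; 1 6];
end
end
```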
But if you can provide the Hessian only at the initial point, then I do not know how you can incorporate the information into the solver. Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation


More Answers (0)


Version

R2014b
