The optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for objective functions in one and two dimensions. The Newton and quasi-Newton methods can run into difficulties: the Hessian may be too complex to compute or may not exist, and the matrix inversion required at each iteration can be prohibitive for problems with many variables. These methods can therefore become impractical. An alternative is the family of gradient descent algorithms, which require neither the explicit computation of the Hessian nor an approximation of it. A gradient descent algorithm is implemented by choosing successive descent directions and the size of the descent step along each chosen direction. This family of algorithms is widely used in the optimization of problems of varying complexity. The term "descent" arises because these algorithms search for extrema by moving in the direction opposite to the objective function's gradient.
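The descent iteration described above can be sketched as follows. This is a minimal illustration in Python, not code from the toolbox itself; the objective f(x) = (x - 3)^2, the fixed step size, and the tolerance are all illustrative choices.

```python
def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x <- x - step * grad(x) until the gradient is small.

    Each update moves opposite to the gradient, which is the descent
    direction for the objective; `step` sets the amplitude of the move.
    """
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:  # stationary point reached (within tolerance)
            break
        x = x - step * g  # descend against the gradient
    return x

# Example: minimize f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(x_min, 4))  # converges near the minimizer x = 3
```

With a fixed step of 0.1 the iteration here contracts the error by a constant factor at each step, so it converges linearly toward the minimizer; Newton-type methods trade this simplicity for faster local convergence at the cost of Hessian information.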
Explanatory algorithmic schemes are available in the user guide.
Cite As
Kenouche Samir (2026). The optimization algorithms (https://es.mathworks.com/matlabcentral/fileexchange/128008-the-optimization-algorithms), MATLAB Central File Exchange. Retrieved .
General Information
- Version 04.2023.01 (143 KB)
MATLAB Release Compatibility
- Compatible with any release
Platform Compatibility
- Windows
- macOS
- Linux
| Version | Published | Release Notes |
|---|---|---|
| 04.2023.01 | | |
