Why does MATLAB use the quasi-Newton L-BFGS method as the solver for fitcnet?
Memo Remo
on 28 Jan 2023
Commented: Memo Remo
on 23 Feb 2023
I want to train a neural network to perform image segmentation using fitcnet. I noticed that MATLAB uses a quasi-Newton solver (L-BFGS) for parameter optimization. Does anyone know why MATLAB selected this method over other optimizers such as Adam or SGDM?
I would appreciate any help you could provide.
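For context, a minimal sketch of the kind of fitcnet call being asked about; the built-in fisheriris data is only a stand-in, and the layer sizes and iteration limit are arbitrary choices:

load fisheriris                          % built-in example data as a stand-in
Mdl = fitcnet(meas, species, ...
    'LayerSizes', [10 10], ...           % two hidden layers of 10 neurons
    'IterationLimit', 200);              % cap on L-BFGS iterations
disp(Mdl.TrainingHistory)                % per-iteration training loss from L-BFGS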
0 comments
Accepted Answer
Lucas García
on 20 Feb 2023
L-BFGS is used in both fitcnet and fitrnet. These functions ship with Statistics and Machine Learning Toolbox and let you get started solving machine learning problems with neural networks.
For more customization, more advanced architectures, and additional solvers, you can use Deep Learning Toolbox, which includes solvers such as SGDM, RMSProp, and Adam. See trainingOptions for more details.
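As a hedged illustration (the numeric values below are arbitrary, not recommendations), choosing one of these solvers in Deep Learning Toolbox is a single argument to trainingOptions:

options = trainingOptions('adam', ...    % or 'sgdm', 'rmsprop'
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 64);
% net = trainNetwork(XTrain, YTrain, layers, options);  % data and layers assumed

The resulting options object is then passed to a training function such as trainNetwork along with your data and layer definitions.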
3 comments
Lucas García
on 20 Feb 2023
Good point, Walter. I wanted to highlight that the solver available in Statistics and Machine Learning Toolbox is best suited to shallow networks (i.e., MLPs). L-BFGS has excellent convergence properties for small networks trained on smaller datasets.
For more complex networks, such as the one in the question (image segmentation), Deep Learning Toolbox does provide solvers such as SGDM and Adam, which are more appropriate for deeper architectures.
It might be worth checking one of the Computer Vision Toolbox examples leveraging Deep Learning Toolbox: https://www.mathworks.com/help/vision/ug/semantic-segmentation-using-deep-learning.html
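To make that concrete, here is a minimal, hedged sketch of such a workflow. unetLayers (from Computer Vision Toolbox) is one documented way to build a segmentation network, though the linked example uses a different architecture; numClasses, the image size, and the training datastore ds are assumed placeholders:

numClasses = 5;                               % hypothetical number of label classes
imageSize  = [256 256 3];
lgraph = unetLayers(imageSize, numClasses);   % encoder-decoder segmentation network
options = trainingOptions('sgdm', ...
    'Momentum', 0.9, ...
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 10);
% net = trainNetwork(ds, lgraph, options);    % ds: assumed image/pixel-label datastore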
More Answers (0)