How to compute gradients using the Neural Network Toolbox software?

Jungyeon Baek on 1 Feb 2019
Commented: Howard Lam on 26 Sep 2019
I've been reading the Neural_Network_Toolbox_Users_Guide and I have a question about the section below.
As stated on page 3-19:
In fact, the gradients and Jacobians for any network that has differentiable transfer functions, weight functions and net input functions can be computed using the Neural Network Toolbox software through a backpropagation process. You can even create your own custom networks and then train them using any of the training functions in the table above. The gradients and Jacobians will be automatically computed for you.
Could you explain this part in detail: how can we obtain the gradients that the training functions compute?
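For concreteness, here is a minimal sketch of what that passage seems to describe, assuming the toolbox functions feedforwardnet, configure and defaultderiv behave as documented (the data and variable names are just illustrative):
x = rand(2, 50);                               % 2 inputs, 50 samples
t = sin(sum(x, 1));                            % scalar target for each sample
net = feedforwardnet(5);                       % one hidden layer with 5 neurons
net = configure(net, x, t);                    % initialize weights/biases for this data
gwb = defaultderiv('dperf_dwb', net, x, t);    % gradient of performance w.r.t. all weights and biases
jwb = defaultderiv('de_dwb', net, x, t);       % Jacobian of the errors w.r.t. weights and biases
whos gwb jwb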

Answers (1)

Greg Heath on 2 Feb 2019
help gradient
doc gradient
Thank you for formally accepting my answer
Greg
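For reference, the gradient function that these commands document is part of base MATLAB and estimates derivatives of sampled data by differencing neighbouring values, for example:
x = 0:0.1:1;
y = x.^2;
dydx = gradient(y, 0.1);   % central differences in the interior, one-sided at the endpoints; roughly 2*x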
  1 comment
Howard Lam on 26 Sep 2019
The gradient function computes only a numerical gradient, which is really just a finite difference of sampled values.
I am not sure whether the activation functions in the Neural Network Toolbox are differentiated symbolically.
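A rough way to see the distinction, assuming getwb, setwb, perform and the same toy setup as the sketch under the question (again, illustrative names only): the toolbox derivative functions backpropagate through the network, and a two-sided finite difference on a single weight should approximately match the corresponding gradient entry.
x = rand(2, 50);  t = sin(sum(x, 1));          % same toy data as above
net = configure(feedforwardnet(5), x, t);
gwb = defaultderiv('dperf_dwb', net, x, t);    % backpropagation gradient

wb = getwb(net);                               % current weights and biases as one vector
k  = 1;  h = 1e-6;                             % check the first entry
e  = zeros(size(wb));  e(k) = 1;
netp = setwb(net, wb + h*e);
netm = setwb(net, wb - h*e);
fd = (perform(netp, t, netp(x)) - perform(netm, t, netm(x))) / (2*h);
fprintf('backprop %g, finite difference %g\n', gwb(k), fd);   % should agree, possibly up to sign convention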
