Backpropagation neural network
6 views (last 30 days)
I learned that the activation functions logsig and tansig return values in the ranges [0, 1] and [-1, 1], respectively. What happens if the target values are beyond these limits?
2 comments
Mohammad Sami
8 Jun 2020
The immediate problem is that the network can never actually reach a target outside the activation's output range, so the error can never go to zero. One consequence is that the ever-present error can drive the weights toward larger and larger values, leading to exploding gradients during training. The usual remedy is to rescale the targets into the activation's range before training and invert the mapping on the network's outputs.
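A minimal sketch of the issue and the rescaling fix, in Python/NumPy rather than MATLAB (the behavior of `tansig`/`logsig` carries over: `tansig` is equivalent to `tanh`; the min-max scaling shown is what MATLAB's `mapminmax` does by default):

```python
import numpy as np

def tansig(x):
    # MATLAB's tansig is mathematically equivalent to tanh: output in (-1, 1)
    return np.tanh(x)

def logsig(x):
    # Logistic sigmoid: output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# No input can make the activation exceed its range, so a target of,
# say, 5 is unreachable and the training error never goes to zero.
x = np.linspace(-10, 10, 101)
assert np.all(np.abs(tansig(x)) < 1.0)
assert np.all((logsig(x) > 0.0) & (logsig(x) < 1.0))

# The fix: rescale targets into the activation's range before training
# (what mapminmax does), then apply the inverse mapping to the outputs.
targets = np.array([2.0, 5.0, -3.0, 10.0])
t_min, t_max = targets.min(), targets.max()
scaled = 2 * (targets - t_min) / (t_max - t_min) - 1  # now in [-1, 1]

# Inverse mapping recovers the original targets from network outputs.
recovered = (scaled + 1) / 2 * (t_max - t_min) + t_min
assert np.allclose(recovered, targets)
```

With this scaling in place, a `tansig` output layer can represent every target exactly, which avoids both the unreachable-target problem and the pressure toward exploding weights.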
Answers (0)