Which activation function is used by the Matlab Convolutional Neural Network Toolbox for the Fully-Connected-Layer?
4 views (last 30 days)
nanocorex
on 21 Apr 2017
Answered: Taylor Smith
on 27 Jun 2017
Especially since it is essentially a multi-layer perceptron, consisting of multiple hidden layers connected to a softmax layer.
0 comments
Accepted Answer
Greg Heath
on 23 Apr 2017
The MATLAB default converts inputs and targets to the [-1, 1] range. However, this does not help in detecting outliers that should be modified or removed.
To detect outliers and prevent saturating hidden nodes, I generally advise converting inputs (and non-classifier targets) to zero-mean/unit-variance and using tansig (i.e., tanh) hidden nodes.
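A minimal sketch of that workflow for a shallow network, assuming the classic Neural Network Toolbox functions (mapstd, fitnet, train; the 3-sigma outlier threshold and the hidden-layer size are just example choices):

```matlab
% Standardize inputs to zero mean / unit variance
[xn, xs] = mapstd(x);    % xs stores the settings, for reuse on new data

% Outlier check: standardized values far outside +/-3 sigma are suspect
outlierCols = any(abs(xn) > 3, 1);   % columns (samples) to inspect

% Shallow fitting network with tansig (tanh) hidden nodes
net = fitnet(10);                       % 10 hidden nodes, example size
net.layers{1}.transferFcn = 'tansig';   % the default, stated explicitly
net = train(net, xn, t);
```

New inputs should then be standardized with the stored settings, e.g. `xnew_n = mapstd('apply', xnew, xs)`, before calling `net(xnew_n)`.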
Hope this helps
Thank you for formally accepting my answer
Greg
0 comments
More Answers (1)
Taylor Smith
on 27 Jun 2017
I have this question as well; however, I need to know exactly which activation/transfer function is used and, ideally, where it is located in the source code. Stepping through the execution of a simple network line by line, I see how the fully connected layer multiplies the input by the appropriate weights and adds the bias, but as far as I can tell that is all that is done to compute the activations of the fully connected layer. The weighted, biased inputs do not seem to be fed to any transfer function.
Any clues?
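For what it's worth, that observation is consistent with how the CNN layers in the toolbox are composed: `fullyConnectedLayer` itself applies only the affine map W*x + b, and any nonlinearity is a separate layer (such as `reluLayer` or `softmaxLayer`) placed after it in the layer array. A sketch, assuming the standard Deep Learning Toolbox layer functions (the architecture and sizes here are only an example):

```matlab
layers = [
    imageInputLayer([28 28 1])      % example input size
    fullyConnectedLayer(100)        % computes W*x + b only, no activation
    reluLayer                       % the nonlinearity is its own layer
    fullyConnectedLayer(10)
    softmaxLayer                    % softmax is applied here, not inside the FC layer
    classificationLayer];
```

So a fully connected layer followed immediately by `softmaxLayer`, with no intervening activation layer, behaves like the output layer of a multinomial logistic regression.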
0 comments