Neural Network Classification: cost function, regularization parameter, availability of the hidden layers...
Richard Palmer
on 27 May 2015
Commented: Richard Palmer on 28 May 2015
How do I derive the cost function J from an NN classification model?
How can I set a regularization parameter? Is this approached using your weights capability?
Are the hidden layer values available after modeling?
0 comments
Accepted Answer
Greg Heath
on 28 May 2015
You cannot derive a cost function from a model.
You specify the cost function for a design.
If you are new at this, start out by accepting the NNToolbox defaults. First check out the relatively simple examples in the classification documentation
help patternnet
doc patternnet
Then search the NEWSGROUP and ANSWERS for posted examples. For example
greg patternnet
The default transfer functions are tansig (hidden layer) and softmax (output layer).
The default "cost" (performance) function is crossentropy.
Answers to many questions about terminology and details can be obtained using the commands help, doc, and type, e.g.,
help patternnet
doc patternnet
type patternnet
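Regarding the hidden layer values: they are not stored after training, but you can recompute them from the trained weights. A rough sketch, assuming the default input processing (removeconstantrows and mapminmax) and the net trained in the sketch above:
% Apply the network's input processing functions in order
xp = x;
for i = 1:numel(net.inputs{1}.processFcns)
    xp = feval(net.inputs{1}.processFcns{i}, 'apply', xp, ...
        net.inputs{1}.processSettings{i});
end
% Hidden layer outputs: one column of tansig activations per sample
h = tansig(net.IW{1,1}*xp + net.b{1});
% The softmax output layer then operates on h; up to any output
% processing, this should match net(x)
ycheck = softmax(net.LW{2,1}*h + net.b{2});
If you skip the input processing step, the activations will not match what the network actually used during training.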
Hope this helps.
Thank you for formally accepting my answer
Greg
1 comment
More Answers (0)