How to avoid getting negative values when training a neural network?
Is there any way to constrain the outputs when training a feedforward neural network in MATLAB?
I am trying to train a supervised feedforward neural network on 100,000 observations. I have 5 continuous input variables and 3 continuous responses (labels). All my values are positive (both labels and variables). However, when I train the network, it sometimes predicts negative values no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also tried a ReLU activation function for the last layer, but then the network cannot generalize well.
Thanks
Accepted Answer
More Answers (1)
Greg Heath
18 Jan 2020
0 votes
Use a sigmoid for the output layer.
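A minimal sketch of this idea, assuming the Deep Learning Toolbox and data arranged column-wise (`x` is 5-by-N, `t` is 3-by-N; the hidden layer size of 10 is an arbitrary placeholder): scale the targets into [0,1] with `mapminmax`, use a `logsig` (sigmoid) output layer so predictions are bounded in (0,1), then invert the scaling, which guarantees predictions stay inside the original positive target range.

```matlab
% Scale targets to [0,1]; ts stores the settings needed to invert later
[tn, ts] = mapminmax(t, 0, 1);

net = feedforwardnet(10);                 % one hidden layer (size is a guess)
net.layers{end}.transferFcn = 'logsig';   % sigmoid output -> bounded in (0,1)

net = train(net, x, tn);                  % train against the scaled targets

yn = net(x);                              % predictions in (0,1)
y  = mapminmax('reverse', yn, ts);        % back to original (positive) units
```

An alternative with the same effect is to train on `log(t)` and exponentiate the predictions, which also forces positivity without bounding the outputs above.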
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
1 Comment
Mostafa Nakhaei
18 Jan 2020