Out-of-memory error while training a neural network (NN): array exceeds maximum array size preference in backpropJacobianStatic
Hello, this is my first time asking a question here, so I will try to be brief and clear!
I am currently trying to train an NN with 2 hidden layers of 256 neurons each; the input and output form a 22-by-size(trainSet) data set. This represents 77334 weights + biases, which shouldn't be a problem for training an NN, since I have seen posts where people train much larger NNs. The issue is that when I call the train function, somewhere inside the MATLAB code (in backpropJacobianStatic) a matrix multiplication creates an array of size 77334-by-77334 (77334 being the number of weights), which takes up all the memory and causes an out-of-memory error (picture of the error below):

My question is the following: is there a way to avoid creating this numberOfWeights-by-numberOfWeights matrix that takes up all the memory during training? I don't really understand why this array needs to be stored, since we only need a 1-by-77334 array to hold the weights, no?
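For what it's worth, a likely explanation (an assumption based on the function name backpropJacobianStatic, not confirmed by this thread): the default training function trainlm (Levenberg-Marquardt) builds the approximate Hessian J'*J, which is an nWeights-by-nWeights matrix. With 77334 weights that is 77334^2 doubles, roughly 48 GB. Gradient-only trainers such as trainscg never form that matrix. A minimal sketch, where the variable names trainInputs and trainTargets are hypothetical placeholders for your own data:

```matlab
% Sketch (assumes Deep Learning Toolbox): avoid Jacobian-based training.
% 'trainlm' forms J'*J (77334-by-77334 here, ~48 GB of doubles);
% 'trainscg' (scaled conjugate gradient) only needs gradient-sized arrays.
net = fitnet([256 256], 'trainscg');      % two hidden layers, 256 neurons each
% trainInputs: 22-by-N matrix, trainTargets: 22-by-N matrix (placeholder names)
[net, tr] = train(net, trainInputs, trainTargets);
```

Other gradient-only options such as 'trainrp' or 'traingdx' should also sidestep the large allocation, usually at the cost of more training epochs than Levenberg-Marquardt.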
Thank you in advance for your answers, and if you have any questions or if I wasn't clear, feel free to ask me for more information!
Accepted Answer
More Answers (2)
Greg Heath
on 17 Jul 2020
0 votes
A single hidden layer is sufficient.
Hope this helps
Thank you for formally accepting my answer
Greg
1 comment
Timothee Fichot
on 17 Jul 2020