Matlab program out of memory on 64GB RAM Linux but not on 8GB RAM Windows

7 views (last 30 days)
Lukasz
Lukasz on 20 Jun 2016
Commented: Lukasz on 22 Sep 2016
I get unexpected out of memory errors when using Matlab 2013a 64-bit on a supercomputer. My program uses no more than 5GB of memory, far less than is available on one node of the cluster, which has 64GB of RAM plus swap space. The same program runs fine on a personal Windows computer with just 8GB of RAM. I am unable to check how much memory Matlab can use because the 'memory' command is unavailable on the Unix platform. My stack space is set to unlimited, although I am not sure whether that has any effect on Matlab. Could you offer me any assistance? Is there a way to check how much memory Matlab can use, and if so, a way to increase it?
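Since 'memory' is Windows-only, the same information can be queried from the operating system instead. A minimal sketch, assuming a standard Linux /proc filesystem; feature('getpid') is an undocumented call that returns the Matlab process ID:

% Total and free physical memory on the node, from /proc/meminfo
[~, meminfo] = system('grep -E "MemTotal|MemFree" /proc/meminfo');
disp(meminfo)

% Per-process resource limits (address space, stack, data segment) inherited
% by the shell that system() spawns from the Matlab process
[~, limits] = system('ulimit -a');
disp(limits)

% Resident memory of this Matlab process in kB; feature('getpid') is undocumented
[~, rss] = system(sprintf('grep VmRSS /proc/%d/status', feature('getpid')));
disp(rss)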
To be more specific, I get the out of memory errors when calling the "train" function of the Neural Network Toolbox. The error's stack trace points to a function with "mex" in its name; MEX files are compiled C/C++ code called from Matlab. I wonder whether Matlab hands part of the calculation to compiled C code and that code runs out of memory. I had thought setting my stack space to unlimited would prevent such a scenario. If anyone has more experience running Matlab on Linux from a terminal, I would appreciate any advice.
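For context, the default 'trainlm' training function builds a full Jacobian on each iteration, which is a common cause of out-of-memory errors inside the compiled training code. A minimal sketch of two possible workarounds, assuming a generic feedforward network (x and t are placeholders for your own inputs and targets, and net.efficiency.memoryReduction is assumed to be available in this toolbox version):

net = feedforwardnet(20);              % placeholder network; use your own

% Workaround 1: a training function that does not form the Jacobian
net.trainFcn = 'trainscg';             % scaled conjugate gradient

% Workaround 2: keep trainlm but split the Jacobian calculation into passes
% net.efficiency.memoryReduction = 4;  % trades speed for lower peak memory

[net, tr] = train(net, x, t);          % x, t are your own inputs and targets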
2 comments
Daniel Walsh
Daniel Walsh on 9 Jul 2016
I have a similar problem. Can someone please answer?
Lukasz
Lukasz on 22 Sep 2016
Hi Daniel,
You have probably solved your problem by now, but I will let you know what I figured out. I believe the Neural Network Toolbox in Matlab 2013a is poorly optimized for memory management. If you are also working on a computer cluster, check whether a newer version of Matlab is installed there. After switching to 2015b my code ran fine.


Answers (0)
