I have a Matlab program which runs the same calculation on a lot of different data sets and, after each data set, writes the results to an output file. When I start running my program, my system has about 7 GB of free space. However, after several data sets have been run, Matlab stops with the error "no space left on device" when creating an output file.
When this happens, running "df" shows that almost all the space on my drive is used, but the output files only total about 200 MB. Then, when I close Matlab and run "df" again, it shows I have about 7 GB of free space once more.
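To be concrete, this is roughly the comparison I'm making (OUTDIR here is just a placeholder for my output directory):

```shell
# Compare filesystem-level free space (df) with the visible size of the
# output files (du). OUTDIR is a placeholder for the output directory.
OUTDIR="${OUTDIR:-.}"

# Free space, in KB, on the filesystem containing OUTDIR
free_kb=$(df -Pk "$OUTDIR" | awk 'NR==2 {print $4}')

# Total size, in KB, of the files visible under OUTDIR
used_kb=$(du -sk "$OUTDIR" | awk '{print $1}')

echo "free on filesystem:   ${free_kb} KB"
echo "visible files in dir: ${used_kb} KB"
```

While Matlab is running, the first number drops toward zero while the second stays around 200 MB.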
It seems that Matlab itself is somehow taking up a lot of hard drive space while it is running. (This doesn't happen right when I start Matlab, only after my program has been running for a while.) However, the workspace does not appear to contain any variables. It seems likely there is some sort of leak in my program.
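One mechanism I've read about that could explain the symptoms (just a hypothesis): on Linux, the space in a file that has been deleted but is still held open by a process is invisible to "du" but still counted by "df", and is only released when the process closes it or exits. A small shell demonstration of that effect:

```shell
# Demonstrate "deleted but still open": the file's space stays allocated
# until the last open descriptor is closed, even though ls/du no longer
# see it. This matches space reappearing only after Matlab exits.
tmpfile=$(mktemp)
exec 3>"$tmpfile"                         # keep a write descriptor open
dd if=/dev/zero of="$tmpfile" bs=1M count=8 status=none
rm "$tmpfile"                             # gone from the directory...
ls -l "$tmpfile" 2>/dev/null || echo "not visible to ls/du anymore"
# ...but its 8 MB stay charged against the filesystem until:
exec 3>&-                                 # the descriptor is closed
```

If this is what's happening, running "lsof +L1" (list open files with a link count of zero, i.e. unlinked files) while the space is missing should show the culprit files and which process holds them.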
I tried to use the memory profiler ("profile -memory on") and it didn't seem to tell me what I wanted to know: even after running the program for long enough that about 800 MB of hard disk space was "lost", the total "Allocated Memory" in the main function was only about 10 MB.
I also did some more research and found that memory fragmentation can sometimes be a problem, so I tried to use the "fragmem" function to fragment memory, found here: fragmem function
But that didn't reproduce the problem (i.e. the free space reported by "df" didn't change).
How do I debug this issue? Is there a way to find out why Matlab seems to be consuming disk space?
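In case it helps, here is how I was planning to watch for growing directories next. The candidate paths are guesses: Matlab's scratch directory (the value of "tempdir" inside Matlab) is typically /tmp on Linux, and it also keeps state under ~/.matlab.

```shell
# Snapshot the size of likely scratch locations while the Matlab job runs;
# re-running this periodically (e.g. under `watch`) shows which one grows.
# The paths are assumptions: adjust to the value of `tempdir` in Matlab.
for d in /tmp "$HOME/.matlab"; do
    [ -d "$d" ] && du -sk "$d" 2>/dev/null
done
```

But if the space is in deleted-but-open files, even this won't see it, which is why I'm asking for a more systematic approach.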
(Another thing that might help: I experienced this issue while running my program on CentOS, but not while running the same program with the same Matlab version (R2013b) on Ubuntu.)