Hello everyone. I use MATLAB for data classification. When I try to create the samples array, I get an 'out of memory' error. The array size is 25920x1296, data type double, so it takes 268.7 Mb. How can I change the maximum MATLAB array size? Or why does this error occur? I tried changing the type of the array to int or single, but it didn't work.
Thanks.

Accepted Answer

Matt Tearle on 25 Feb 2011

Technically, 25920-by-1296 is 256.3Mb, but let's not quibble over a few meg :)
If you're on a Windows machine, type memory to see what the maximum available variable space is. MATLAB doesn't restrict arrays -- it's really a restriction of your system. So unless you can clear up space in some way, your only option is to crunch down the data or add more memory.
That said, look at doc memory for some ideas.
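On Windows, the check Matt describes can be sketched like this (the needed-size arithmetic uses the array dimensions from the question; the printed values will of course depend on your machine):

```matlab
% Query MATLAB's view of available memory (Windows only)
user = memory;                          % struct of memory statistics
maxArrayBytes = user.MaxPossibleArrayBytes;
neededBytes   = 25920 * 1296 * 8;       % double = 8 bytes per element
fprintf('Largest possible array: %.1f MB, needed: %.1f MB\n', ...
        maxArrayBytes/2^20, neededBytes/2^20);
```

If the "largest possible array" figure comes out below the needed size, the error is about contiguous space, not total memory.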
How are you creating the array? When you say you tried to change type, how did you do that?

1 comment

Walter Roberson on 25 Feb 2011
If Alex is getting Out Of Memory for an array that small, then chances are quite high that Alex is using a 32-bit version of MATLAB, in which case MATLAB *does* restrict arrays. rand(10000000) is not going to complain about "out of memory" on such a system: it is going to complain that the number of elements in the array is too high.
Not that this distinction makes any practical difference on a system that can't hold an array that size...


More Answers (1)

Alex Hoppus on 25 Feb 2011

"How are you creating the array?"
I don't know the size of the array in advance, so I fill it this way:
first initialization: array = zeros(0, numel(descriptorVector));
then filling: array(end+1,:) = descriptorVector;
"When you say you tried to change type, how did you do that?"
Like this: single(array) and uint8(array)
So why doesn't changing the type affect this error? I thought changing the type might solve the problem, because integers take less memory... or what?

3 comments

Matt Tearle on 25 Feb 2011
OK, so I assume you're filling the array inside some kind of loop (presumably a while loop) in which you're doing ... something. One thing I was trying to establish is whether you're reading data from a source or creating it in some way. It seems like the latter. Now, the lack of preallocation of space may be causing some of the problem. As you may know, MATLAB requires arrays to be stored in contiguous memory blocks, which is why the first line from the memory command gives the largest possible array (which is smaller than the total available). When an array is increased in size and runs out of space in its current memory location, it has to be copied somewhere with enough room. I don't know the underlying details of how that's done, but it's possible that that's causing some of your pain. Is there any way you can get some kind of likely upper bound on the number of rows? If you can, and it will fit into memory, I think you'd be better off preallocating, then deleting unused rows afterward.
Aside: is there any reason your data has to be structured by rows? If you could make the descriptor vectors the columns of the matrix, that would very likely make your code run faster. It *might* help with the memory issues (again, related to the resizing problem).
Changing a double to something else is a bit "closing the barn door after the horse has bolted". Try making the array single/uint8 to start with, then adding single/uint8 vectors to it, so you don't have to go via double.
array = zeros(..., 'single');
array(end+1,:) = single(descriptorVector);
(Or if you can make the descriptor vector native single/int, even better.)
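Putting the two pieces of advice together (preallocate with an upper bound, fill in single, trim afterward), a minimal sketch might look like this; maxRows, haveMoreDescriptors and nextDescriptor are assumptions standing in for whatever loop and data source Alex actually has:

```matlab
maxRows = 30000;                          % assumed upper bound on row count
array = zeros(maxRows, 1296, 'single');   % preallocate once, already single
n = 0;                                    % rows actually filled
while haveMoreDescriptors()               % hypothetical data source
    n = n + 1;
    array(n, :) = single(nextDescriptor());  % hypothetical next row
end
array = array(1:n, :);                    % drop the unused rows
```

This avoids both the repeated copying from growth and the temporary double copy from converting after the fact.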
Walter Roberson on 25 Feb 2011
Also, consider using cell arrays to build up the array. Once it is completely built, you can cell2mat() it. This won't get around the size problems, but will be more efficient than growing a numeric array. Pre-allocating will be faster.
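A minimal sketch of Walter's cell-array approach (haveMoreDescriptors and nextDescriptor are hypothetical placeholders for the actual loop and data source):

```matlab
rows = {};                            % cell array, one descriptor per cell
while haveMoreDescriptors()           % hypothetical data source
    rows{end+1, 1} = nextDescriptor();   % each cell holds one 1x1296 row
end
array = cell2mat(rows);               % stack the rows into one matrix at the end
```

Growing a cell array only copies the small array of cell references, not the data in every cell, which is why it beats growing the numeric matrix directly.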
Matt Tearle on 25 Feb 2011
I was thinking of avoiding that route because it would make the memory problem worse when you do cell2mat... but actually Walter's suggestion does bring up another possibility (dirty, but maybe effective). If the issue is, for whatever reason, the maximum single array size, rather than the overall memory, you might be able to store the data as a cell array, then just access everything with indexing (and probably lots of nasty loops). IANA developer, so don't take this as gospel, but I do know that cell arrays have some memory overhead, but they effectively act like an array of pointers to other memory locations. So if each row of your array was a cell, you'd need space for a 25920-by-1 cell array, which I think would be about 1.5Mb of contiguous space. Then all the elements of the cells could be stored wherever, each needing only 10K of contiguous space. (NB: overall that's more memory, but fragmentation wouldn't be so problematic). I don't really know if that will help, but it's an idea.
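A rough sketch of the cell-per-row idea, where the data stays in the cell array and is accessed by indexing (the variable names and the element/column access patterns are assumptions, not Alex's actual code):

```matlab
nRows = 25920; nCols = 1296;
rows = cell(nRows, 1);                     % only the cell array itself is contiguous
for k = 1:nRows
    rows{k} = zeros(1, nCols, 'single');   % each row allocated independently
end
val  = rows{7}(3);                         % read "element (7,3)" of the virtual matrix
col3 = cellfun(@(r) r(3), rows);           % gather "column 3" when a whole column is needed
```

Element access stays cheap; whole-column operations need a cellfun or loop, which is the price of giving up the contiguous matrix.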


Asked: 25 Feb 2011
Edited: 24 Oct 2020
