'not enough storage' error on xlsread

Jason on 2 Feb 2011
Commented: Jim on 30 Oct 2014
MATLAB R2010a, Windows XP (running under VMware on a Linux workstation).
I'm trying to read an Excel sheet of about 60,000 x 75 values (21 MB file size) using xlsread. I get "??? Error: Not enough storage is available to complete this operation." The Excel file has several sheets in it, but the 60k x 75 sheet is the largest. After some experimentation, here are my observations:
1. I have no problem creating arrays of this size directly in MATLAB. This would imply it's not strictly a memory problem with the resulting array size.
2. I am able to read a different Excel file whose largest sheet is 60k x 21 with no problem. The file in question also has a 60k x 21 sheet, but if I try to read that sheet by specifying the worksheet, it fails, even though I know xlsread can handle 60k x 21. This seems to imply that either the function always reads the entire file, even when I specify a smaller worksheet, or the file size itself is a factor, even if I'm only trying to read a small subset of the data.
3. If I specify a cell range in the xlsread call so I'm reading a subset of the 60k x 75 worksheet, it works (up to a point, of course). If I specify the entire 60k x 75 range, it fails, as you would expect. Contrary to 2, this seems to imply that the total file size is NOT a factor, only the amount of data you actually want to read. I can also read the data in two steps by reading the first 30k rows, doing a second read for the last 30k rows, and concatenating the resulting arrays back into one (a sketch follows this list).
4. I've had mixed results taking a file that has failed the read and reducing it in size. In one case I reduced a file (by deleting sheets and columns) down to 60k x 21 values in a 4 MB file, but xlsread still failed, even though I can read 60k x 50 from another file with multiple worksheets totaling 16 MB. In other cases, I can take a file that was failing, delete rows and/or sheets, and the xlsread works. I don't know what to make of this.
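Here is roughly what that two-step read looks like; the file and sheet names are placeholders, and BW is column 75 in Excel's lettering:
top = xlsread('bigfile.xls', 'data', 'A1:BW30000');      % first 30k rows, 75 columns
bottom = xlsread('bigfile.xls', 'data', 'A30001:BW60000'); % last 30k rows
vals = [top; bottom];                                      % stitch back into one 60k x 75 array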
If it helps, here is what the memory command reports:
>> [uv, sv] = memory
uv =
    MaxPossibleArrayBytes: 150216704
    MemAvailableAllArrays: 784936960
    MemUsedMATLAB: 969478144
sv =
    VirtualAddressSpace: [1x1 struct]
    SystemMemory: [1x1 struct]
    PhysicalMemory: [1x1 struct]
>> sv.VirtualAddressSpace
ans =
    Available: 784936960
    Total: 2.1474e+009
>> sv.SystemMemory
ans =
    Available: 3.2114e+009
>> sv.PhysicalMemory
ans =
    Available: 1.5888e+009
    Total: 3.2081e+009
According to the MATLAB help and the memory info, a MaxPossibleArrayBytes of about 150 MB should allow a single double array on the order of 18 million values (150216704 / 8 bytes per double ≈ 18.8 million), and 60k x 75 is only 4.5 million, nowhere near that.
Does anyone have experience reading large spreadsheets with xlsread? Is there a way to force it to use single precision instead of double? I don't actually need the full 60k x 75 array; I was hoping to have MATLAB delete rows based on the value in a certain column rather than trying to specify a data range for xlsread to read. Any advice is much appreciated!
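For what it's worth, xlsread always returns numeric data as double, so precision can only be reduced after the read. A hedged sketch of that, plus the row-filtering idea (file name, sheet name, and the tested column are placeholders):
num = xlsread('bigfile.xls', 'data');  % xlsread itself always returns double
num = single(num);                     % halves storage, but only once the read has succeeded
num = num(num(:, 3) ~= 0, :);          % keep only rows whose column 3 is nonzero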
  6 comments
mohammad on 20 Sep 2011
Thanks a lot, Jan. Please see my answer below and give your opinion (to avoid duplicating the question, I didn't post a new one).
Kirill Andreev on 20 Jun 2012
I also ran into this problem loading a large dataset from Excel. I load the entire matrix from Excel into a cell array, convert it into a dataset, and, after all the necessary manipulations, export the data into a SQL Server database. Here is the approximate size of the dataset:
ds =
    indic_de: {1220856x1 cell}
         sex: {1220856x1 cell}
         age: {1220856x1 cell}
         geo: {1220856x1 cell}
        time: [1220856x1 double]
       Value: {1220856x1 cell}
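A rough sketch of that load-and-split step, assuming the columns arrive in the order shown above, the first row is a header, and the file name is made up:
[~, ~, raw] = xlsread('eurostat_export.xls');  % entire sheet as one cell array
ds.indic_de = raw(2:end, 1);                   % skip the header row
ds.sex      = raw(2:end, 2);
ds.age      = raw(2:end, 3);
ds.geo      = raw(2:end, 4);
ds.time     = cell2mat(raw(2:end, 5));         % the one numeric column
ds.Value    = raw(2:end, 6);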
The problem was that MATLAB had only 35 MB available for a single array:
>> memory
Maximum possible array: 35 MB (3.716e+07 bytes) *
Memory available for all arrays: 421 MB (4.415e+08 bytes) **
Memory used by MATLAB: 769 MB (8.062e+08 bytes)
Physical Memory (RAM): 3454 MB (3.622e+09 bytes)
What I did was:
a) cleared all big variables before loading data from Excel:
clear ds
% load from Excel
b) reduced the Java heap memory from 512 MB to 128 MB (this freed up about 80 MB for arrays)
c) restarted MATLAB and ran pack at the command prompt before running my script (a sketch of steps a and c follows)
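A minimal sketch of steps (a) and (c); step (b) is set in the Preferences dialog rather than in code:
clear ds   % release the big variable before the Excel read
pack       % consolidate workspace memory (run from the command prompt, not inside a function)
uv = memory;  % confirm the larger contiguous block (Windows only)
fprintf('Max array: %d bytes\n', uv.MaxPossibleArrayBytes);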
These steps brought me the following memory:
Maximum possible array: 632 MB (6.631e+08 bytes) *
Memory available for all arrays: 1437 MB (1.507e+09 bytes) **
Memory used by MATLAB: 385 MB (4.037e+08 bytes)
Physical Memory (RAM): 3454 MB (3.622e+09 bytes)
* Limited by contiguous virtual address space available.
** Limited by virtual address space available.
I had to rewrite the database-loading code a little, since I now run into Java heap memory problems, but the rest went more or less smoothly.
Kirill


Answers (3)

Nick Haddad on 3 Oct 2014
This issue is a known bug in MATLAB and has been addressed in a MATLAB bug report. The bug report has a workaround which you can install for MATLAB R2013a through R2014b.
  1 comment
Jim on 30 Oct 2014
I installed this patch, and although it works if you read the entire sheet, I was still getting an error if I tried to limit the range. MathWorks is now aware of this continuing issue, but I don't see a modification to the bug report yet. The limitation to reading the entire sheet is not a big problem, because the range limits can be imposed after the read by taking a subset of the resulting array (sketched below).
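A hypothetical example of that workaround, with made-up file and sheet names: read the whole sheet, then apply the range in MATLAB:
num = xlsread('bigfile.xlsx', 'Sheet1');  % full-sheet read works with the patch
subset = num(1:30000, 1:21);              % impose the "range" after the fact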



mohammad on 20 Sep 2011
Are these solutions helpful?
1. MATLAB: File >> Preferences >> General >> Java Heap Memory (increase this)
2. OS (Windows): increase the virtual memory
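A quick way to check the current Java heap ceiling from inside MATLAB (the value reflects whatever that preference is set to):
rt = java.lang.Runtime.getRuntime();  % MATLAB's embedded JVM
fprintf('Java heap max: %.0f MB\n', rt.maxMemory / 2^20);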
  3 comments
Jan on 20 Sep 2011
I'd expect that *decreasing* the Java heap memory releases some memory, which MATLAB could then use to import the Excel tables. You can try increasing the virtual memory, but this will not really help: reading from the hard disk is about 1000 times slower than accessing RAM, so a program that runs in 10 minutes when it fits into memory will run for about 167 hours once it spills into virtual memory. The energy costs will soon exceed the cost of buying more RAM.
mohammad on 20 Sep 2011
Thanks, Jan.



Pian Xiaochuan on 9 Apr 2014
Try the command importdata instead of xlsread. I also met this problem and found the following thread: http://www.mathworks.de/matlabcentral/newsreader/view_thread/326745
Then I used importdata and the problem was solved. :)
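A minimal sketch of the importdata route; the file name is a placeholder, and for a multi-sheet workbook the numeric data typically comes back as one field per sheet:
s = importdata('bigfile.xlsx');  % struct with data / textdata fields
vals = s.data.Sheet1;            % numeric block of the sheet named 'Sheet1'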
