Changing large matrices without loading them completely into memory

5 views (last 30 days)
Moritz on 18 Jun 2015
Commented: Walter Roberson on 18 Jun 2015
Hi,
I'm attempting to modify very large matrices (single, 50e3 x 50e3) that it doesn't make sense to load fully into memory. What data-handling strategy would you recommend? Ideally I could always load, say, a 100x100 block, modify it, and write it back. My working machine uses an SSD connected via M.2, so it should be relatively fast (though of course nowhere near as fast as RAM). What suggestions do you have?
Thanks,
Moritz

Answers (2)

Stephen23 on 18 Jun 2015
Edited: Stephen23 on 18 Jun 2015
You should read TMW's own advice on working with big data:
And in particular you might find memmapfile to be of significant interest:
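A minimal sketch of the block-wise load/modify/write idea using memmapfile. It assumes the matrix has already been written once to a flat binary file (the filename bigmat.bin and the doubling operation are just illustrations):

```matlab
% Map a 50e3 x 50e3 single matrix stored in a raw binary file.
% The file must already exist with exactly n*n*4 bytes.
n = 50e3;
m = memmapfile('bigmat.bin', ...
    'Format', {'single', [n n], 'A'}, ...
    'Writable', true);

% Modify one 100x100 block in place; only the touched pages of the
% file are brought into memory by the OS.
rows = 1:100;
cols = 1:100;
m.Data.A(rows, cols) = 2 * m.Data.A(rows, cols);

clear m   % unmap the file and flush the changes to disk
```

Because the mapping is writable, assignments into m.Data.A go straight to the file, which fits the "load a small square, modify it, write it back" workflow without ever holding the full 10 GB array in RAM.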
  1 comment
Walter Roberson on 18 Jun 2015
Or, instead of memmapfile, save the array to a .mat file with -v7.3 and then use matfile objects to read in portions of it.
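The matfile route can be sketched like this (the filename bigmat.mat, variable name A, and the +1 update are hypothetical; the key point is that indexing into a -v7.3 MAT-file touches only the requested block):

```matlab
% One-time setup: save the matrix in version 7.3 format, which
% supports partial loading and saving.
% save('bigmat.mat', 'A', '-v7.3');

% Open the MAT-file for partial read/write access.
mf = matfile('bigmat.mat', 'Writable', true);

% Read, modify, and write back a 100x100 tile without ever
% loading the full matrix A into memory.
blk = mf.A(1:100, 1:100);
mf.A(1:100, 1:100) = blk + 1;
```

Compared to memmapfile, this keeps the data in a standard MAT-file, at the cost of going through HDF5-backed I/O rather than a raw memory mapping.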



Alessandro on 18 Jun 2015
Did you check out the sparse command?
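For context, sparse storage only pays off when most entries are zero, since a dense 50e3 x 50e3 single matrix needs about 10 GB while a sparse matrix's memory grows with the number of nonzeros. A small sketch:

```matlab
% An empty 50e3 x 50e3 sparse matrix costs almost nothing to create;
% storage is proportional to nnz(S), not to n^2.
n = 50e3;
S = sparse(n, n);
S(1, 1) = 3.5;   % each nonzero is stored individually (with its indices)

whos S           % reported bytes reflect the nonzero count, not n^2
```

As noted in the comment below, this doesn't help when the matrix is almost entirely nonzero.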
  1 comment
Moritz on 18 Jun 2015
Yes, I did. However, I believe that only helps if a considerable fraction of the elements are zero. In my case the fraction of zero elements is < 5%.

