gpuArray size limit is much less than the GPU maximum memory

72 views (last 30 days)
z223641s
z223641s on 29 Oct 2024 at 0:30
Commented: z223641s on 30 Oct 2024 at 15:51
When using gpuArray to speed up my computation, I vectorized all of my parallel element-wise operations, obtaining two large matrices as gpuArrays on which I perform basic multiplications, summations, etc. I expected this to be faster than using arrayfun, and it did work well. To speed things up further, I want to maximize the GPU's memory usage: instead of performing several smaller matrix operations, it should be better to perform one bigger matrix operation once. However, I found that MATLAB limits the number of elements in an array to intmax('int32'), which corresponds to far less memory than my device's maximum GPU memory. Is there a way to overcome this?
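One common workaround (a sketch under my own assumptions, not something stated in this thread) is to split the work into column blocks so that each gpuArray stays under the intmax('int32') element limit; the variable names below (nRows, nCols, chunkCols) are illustrative:

```matlab
% Sketch: process a problem too large for one gpuArray by splitting it
% into column chunks, each below the intmax('int32') element limit.
nRows     = 1e5;                              % illustrative sizes
nCols     = 4e4;
maxElems  = double(intmax('int32'));          % per-array element limit
chunkCols = max(1, floor(maxElems / nRows));  % columns per chunk

result = zeros(1, nCols);                     % host-side accumulator
for c0 = 1:chunkCols:nCols
    c1 = min(c0 + chunkCols - 1, nCols);
    A  = gpuArray.rand(nRows, c1 - c0 + 1);   % stand-in for real data
    B  = gpuArray.rand(nRows, c1 - c0 + 1);
    result(c0:c1) = gather(sum(A .* B, 1));   % element-wise op + reduce
end
```

Each chunk is well under the index limit, so the per-array restriction is never hit, at the cost of launching the operation once per chunk.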

Answers (1)

Joss Knight
Joss Knight on 29 Oct 2024 at 9:51
  1 comment
z223641s
z223641s on 30 Oct 2024 at 15:51
Thank you for the guidance. For now I will just split the arrays to stay within the int32 index limit. However, this requires repeatedly loading the split data chunk by chunk and then computing, even though in theory these repeated steps should be able to run in parallel, since there is enough memory. For the computation part, I tried using the split data with arrayfun and did manage to compute all the split chunks on the GPU at the same time. However, I am having trouble loading the split data to the GPU in parallel. I started with a for-loop that loads the split parts one by one to the GPU, and later changed the for to a parfor loop, but that does not seem to work.


Version

R2023a
