GPU programming for Mac M1
108 views (last 30 days)
Matilda
on 19 Nov 2024 at 13:07
Edited: Walter Roberson on 13 Dec 2024 at 19:23
Hello!
I was wondering whether anyone has found a way to use functions such as gpuArray in MATLAB on Apple M1 GPUs.
Alternatively, are there other ways to perform computations on the GPU under macOS? I would appreciate any ideas or information.
Does MATLAB have any plans to extend its GPU programming support to non-NVIDIA GPUs?
Apologies if this has been asked before; all the posts I found were quite dated.
3 comments
John D'Errico
on 19 Nov 2024 at 17:28
It is something I am sorry to have said, since I myself use only Macs for my work. As Mac users, we'll all keep hoping for the best though.
Answers (2)
Mike Croucher
on 13 Dec 2024 at 14:44
I have an M2 Mac and I have just bought an M4 Mac for my wife. I love the hardware on these machines; it's superb! Around this time last year I wrote the blog posts that announced the betas for Apple Silicon and also showed how to switch to Apple Accelerate (Apple » The MATLAB Blog - MATLAB & Simulink). I, along with many other MathWorkers, am invested in this platform.
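If you want to confirm which BLAS and LAPACK libraries your session is actually using (for example, after switching to Apple Accelerate as described in that post), a quick check from the command window is sketched below; the exact strings returned depend on your release and configuration.
% Report the BLAS and LAPACK libraries used by the current MATLAB session.
% The returned strings identify the active backend.
version('-blas')
version('-lapack')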
I would love to see MATLAB support Apple GPUs, and I help the development team keep track of requests from users.
First off, I disagree with @Walter Roberson: our GPU support is not primarily aimed at Deep Learning support. We have over 1,200 gpuArray-enabled functions spread across 14 toolboxes, and more are being added with every release (MATLAB now has over 1,000 functions that Just Work on NVIDIA GPUs » The MATLAB Blog - MATLAB & Simulink). At least one extra toolbox will be getting GPU support in R2025a that I know of.
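For anyone checking their own machine, here is a minimal sketch of how to see whether a supported GPU is visible before taking a gpuArray code path. It assumes Parallel Computing Toolbox is installed; on an Apple silicon Mac it currently reports no supported device.
% Check whether a supported (CUDA-capable NVIDIA) GPU is available.
% Requires Parallel Computing Toolbox; on Apple silicon this reports none.
if canUseGPU()
    d = gpuDevice;                               % details of the selected GPU
    fprintf('Using %s (%.1f GB memory)\n', d.Name, d.TotalMemory/1e9);
else
    fprintf('No supported GPU found (%d reported); using the CPU.\n', gpuDeviceCount);
end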
So, let's think about gpuArray support first. Most uses of technical computing outside of deep learning use double precision. MATLAB's default data type is double. MATLAB users expect double. Apple silicon GPUs do not support double. This is a problem!
OK, so when you have this conversation with people there will be a subset who will say 'I'd be happy with single'. OK, great. To do what? What workflow do you have right now that you need this support for? What functions would you need to see supported? Have you ever run this on an NVIDIA GPU and got a speed-up? Do you have any evidence that the Apple silicon GPU would actually help here? By how much?
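If it helps anyone gather that kind of evidence, a rough benchmarking template is below. It requires Parallel Computing Toolbox and a supported NVIDIA GPU; the matrix size and the operation are placeholders to swap for your own workflow, and the GPU copy is in single precision because that is the best an Apple silicon GPU could offer.
% Time the same operation on the CPU (double) and on a supported NVIDIA GPU
% (single). Replace the matrix and the operation with your own workload.
A = rand(4000);                          % placeholder workload
tCPU = timeit(@() A*A);                  % CPU, double precision
gA = gpuArray(single(A));                % single-precision copy on the GPU
tGPU = gputimeit(@() gA*gA);             % gputimeit synchronises the GPU for you
fprintf('CPU (double) %.3f s, GPU (single) %.3f s, speed-up %.1fx\n', ...
    tCPU, tGPU, tCPU/tGPU);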
The answers to all of these questions help drive conversations internally. Providing support for Apple Silicon GPUs would be a major undertaking and doing it would mean that something else wouldn't get done. More likely it would be a lot of 'something else's'!
Of course I can't say whether this support will ever come, but I know that detailed conversations about what is wanted and why are the way to help out.
1 comment
Walter Roberson
on 13 Dec 2024 at 19:07
Edited: Walter Roberson on 13 Dec 2024 at 19:23
our GPU support is not primarily aimed at Deep Learning support.
MathWorks has told me this (that it is aimed at Deep Learning) directly.
Walter Roberson
on 19 Nov 2024 at 19:03
If I recall correctly, someone posted indicating that they had written MEX C++ code that calls into Apple's GPU routines and invokes it from within MATLAB. This is not the same as using gpuArray() with automatic dispatch to the GPU as needed. As far as I recall, the person had not made the interface code publicly available.
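To make that distinction concrete, the MATLAB side of such an approach would look roughly like the sketch below. The MEX file and its name (metal_matmul here) are hypothetical; nothing like it ships with MATLAB, and you would have to write and compile the Metal wrapper yourself.
% Hypothetical: assumes you have written and compiled your own MEX wrapper
% (called metal_matmul here) around an Apple Metal compute kernel.
A = single(rand(2048));
B = single(rand(2048));
C = metal_matmul(A, B);          % explicit call into the hand-written wrapper
% With gpuArray on a supported NVIDIA GPU, dispatch is automatic instead:
% gA = gpuArray(A);  gB = gpuArray(B);  C = gather(gA * gB);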
Does MATLAB have any plans to extend its GPU programming support to non-NVIDIA GPUs?
The last time I asked MathWorks about this, the answer was that they had no plans to extend GPU programming to Apple Silicon.
It is difficult to get straight answers from Apple about the best way to use the GPU.
Apple has a history of leaving pieces of technology undocumented, and letting ecosystems of best-effort grow up, only to later deliberately break the best-effort code, saying, "We never said to do it that way so any problems are your fault!"
But also, MathWorks' GPU support is primarily aimed at Deep Learning. MathWorks chases the Deep Learning market. Current research work using Apple Silicon GPUs is a comparatively small portion of research work. The majority of research work is on NVIDIA GPUs; the second-largest group of Deep Learning research work is on IBM equipment; other groups are far behind in market share. By market-share measures, MathWorks would be better off going after support for IBM equipment.
0 comments