How do I use multiple GPUs for a GAN?
In the example mentioned on MATLAB, "Train Generative Adversarial Network (GAN) - MATLAB & Simulink (mathworks.com)", how/where should the code be changed so as to use multiple GPUs?
Although "auto" is used, not all the GPUs are used by default. I have 4 GPUs and want to use them all.
1 comment
Shuaibin WAN
on 25 Nov 2021
Hi Shaw,
I also encounter this problem. Did you have any solution now?
Many thanks!
Answers (2)
Antti
on 12 Oct 2021
Edited: Antti on 12 Oct 2021
Hi! You should change the 'ExecutionEnvironment' training option to 'multi-gpu'. Before doing that, you might want to check that your GPUs are detected by running:
>> numGPU = gpuDeviceCount("available")
If you don't get 4 as a result, then your GPUs are not supported by MATLAB, or there's a driver issue. Please formally accept my answer if it worked for you.
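As a minimal sketch of what that looks like (assuming the standard trainNetwork workflow; the network and training data here are placeholders):

options = trainingOptions("adam", ...
    "ExecutionEnvironment", "multi-gpu", ... % split each mini-batch across all local GPUs
    "MaxEpochs", 10, ...
    "MiniBatchSize", 128);
net = trainNetwork(XTrain, YTrain, layers, options); % XTrain, YTrain, layers are placeholders

Note the caveat in my second answer below: this option applies to trainNetwork, not to custom training loops like the one in the GAN example.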
Antti
on 12 Oct 2021
It appears that when using custom training loops (as in the example), the "multi-gpu" option is not supported. However, you can still take advantage of multiple GPUs by launching parallel MATLAB workers, where each worker uses a GPU of its own; a sketch follows below. See this example: https://se.mathworks.com/help/deeplearning/ug/train-network-in-parallel-with-custom-training-loop.html. Please formally accept my answer if this solves your problem.
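A minimal sketch of that pattern, assuming 4 local GPUs (the loop body is only outlined; the linked example shows the complete custom training loop):

% One parallel worker per available GPU.
numGPUs = gpuDeviceCount("available");
pool = parpool(numGPUs);

spmd
    % Give each worker its own GPU (use labindex instead of spmdIndex
    % in releases before R2022b).
    gpuDevice(spmdIndex);

    % On each worker, run the custom training loop on its shard of the
    % data: move mini-batches to the GPU with gpuArray/dlarray, compute
    % gradients with dlfeval, and aggregate gradients across workers
    % (e.g. with spmdPlus, or gplus in older releases) before the
    % adamupdate step.
end

delete(pool); % shut down the pool when training finishes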
0 comments