
Best way to integrate GPU use in my code?

2 views (last 30 days)
AlexRD on 18 May 2021
Commented: Infinite_king on 18 Apr 2024
I've started doing a lot of work on a neural-net implementation I've built from scratch in MATLAB. Initially I switched from the GPU to CPU-only, since that was easier to write and debug and let me defer the GPU work until later.
I'm now on the GPU implementation, but I'm struggling to get an optimized result. I noticed the GPU struggles a lot with multiple layers: processing time is often directly proportional to the number of layers. The CPU, by contrast, doesn't really care about the number of layers (as long as the neuron counts aren't extremely high), but it struggles with the input layer, given the number of weights and biases involved.
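Processing time growing linearly with layer count on the GPU is often a sign of per-layer kernel-launch and synchronization overhead rather than arithmetic cost. One way to keep that overhead down is to keep every layer's data on the device and transfer only the final output back. A minimal sketch, assuming fully connected layers with ReLU activations (the sizes and variable names here are illustrative, not taken from the project):

```matlab
% Sketch: run the whole forward pass on the GPU so only the final
% result crosses the device-to-host boundary. Sizes are illustrative.
X = gpuArray.rand(784, 256, 'single');        % batch of 256 inputs
W = {gpuArray.randn(500, 784, 'single'), ...  % layer weight matrices,
     gpuArray.randn(500, 500, 'single'), ...  % created directly on the GPU
     gpuArray.randn(10,  500, 'single')};
A = X;
for k = 1:numel(W)
    A = max(W{k} * A, 0);                     % linear + ReLU, stays on GPU
end
y = gather(A);                                % single device-to-host copy
```

Larger batch sizes also help: each kernel launch then does more useful work, so the fixed per-layer overhead is amortized.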
I've tried a hybrid approach in which the input layer and any convolutional layers are assigned to the GPU, and the GPU results are then fetched and processed by the CPU, but the fetch time often isn't worth the hassle.
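Before committing to a hybrid split, it can help to measure whether the device-to-host transfer actually dominates. A small sketch using `gputimeit`, which handles GPU synchronization correctly (a plain `tic`/`toc` around asynchronous GPU calls can under-report); the matrix size here is an arbitrary example:

```matlab
% Sketch: compare on-device compute time against transfer time
% to decide whether fetching intermediate results to the CPU pays off.
Xg = gpuArray.rand(4096, 'single');
tCompute  = gputimeit(@() Xg * Xg);      % work that stays on the device
tTransfer = gputimeit(@() gather(Xg));   % device-to-host copy
fprintf('compute: %.4f s, transfer: %.4f s\n', tCompute, tTransfer);
```

If the transfer time is comparable to the compute it replaces, the hybrid approach is unlikely to win.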
Some feedback would be very welcome. My project, fully documented, can be found here: https://github.com/AlexRDX/Neural-Net
It is also attached to this post. Any criticism at all is welcome.
Thank you for your time!

Answers (0)

Categories

Find more on Deep Learning Toolbox in Help Center and File Exchange.

Products


Version

R2021a

