GTX1060 for deep learning semantic image segmentation

1 view (last 30 days)
TopologicalSphere
TopologicalSphere on 22 Jan 2018
Edited: Joss Knight on 23 Jan 2018
Hello!
I am attempting to train SegNet for semantic segmentation, following the example shown here: https://www.mathworks.com/examples/matlab/community/24778-semantic-segmentation-using-deep-learning
However, I keep running into an out-of-memory error. I am wondering whether the error is in my code, or whether my GTX 1060 3 GB GPU is simply not powerful enough to train SegNet as in the example. I have already reduced the mini-batch size to 1, so I'm unsure if there are any other fixes I can make; a sketch of my training options is below.
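For reference, this is roughly what my training options look like (the solver and the other settings are assumed from the example; trainingData and lgraph are placeholders for my datastore and the SegNet layer graph):

    % Training options from the SegNet example, with MiniBatchSize lowered
    % to 1 to try to fit the network into 3 GB of GPU memory.
    opts = trainingOptions('sgdm', ...
        'Momentum', 0.9, ...
        'InitialLearnRate', 1e-3, ...
        'MaxEpochs', 100, ...
        'MiniBatchSize', 1, ...          % reduced from the example's default
        'Shuffle', 'every-epoch', ...
        'VerboseFrequency', 2);

    % Train SegNet (still runs out of memory on the GPU with these settings).
    [net, info] = trainNetwork(trainingData, lgraph, opts);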
Thanks!

Answers (1)

Joss Knight
Joss Knight on 23 Jan 2018
Edited: Joss Knight on 23 Jan 2018
Yes, 3 GB isn't enough for this example, sorry. SegNet is simply too high-resolution a network. You could try training on the CPU. Alternatively, 3 GB might be enough if your 1060 were not also driving the display.
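As a rough sketch of the CPU route (the solver settings here are assumptions carried over from the example, not a prescription), you only need to change the execution environment in trainingOptions:

    % Force training onto the CPU; slower, but not limited by GPU memory.
    opts = trainingOptions('sgdm', ...
        'InitialLearnRate', 1e-3, ...
        'MaxEpochs', 100, ...
        'MiniBatchSize', 4, ...
        'ExecutionEnvironment', 'cpu');  % 'cpu' instead of the default 'auto'/'gpu'

You can also check how much GPU memory is actually free before training with gpuDevice (the AvailableMemory field), which will show how much the display is consuming.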
