GTX1060 for deep learning semantic image segmentation
Hello!
I am attempting to train SegNet for semantic segmentation, following the example here: https://www.mathworks.com/examples/matlab/community/24778-semantic-segmentation-using-deep-learning
However, I keep running into an out-of-memory error. I am wondering whether the error is in my code, or whether my GTX 1060 3 GB GPU is simply not powerful enough to train SegNet as in the example. I have already reduced the mini-batch size to 1, so I'm unsure whether there are any other fixes I can make.
Thanks!
Answers (1)
Joss Knight on 23 Jan 2018 (edited 23 Jan 2018)
Yes, 3 GB isn't enough for this example, sorry. SegNet is simply too high-resolution a network. You could try training on the CPU instead. Alternatively, 3 GB might be enough if your 1060 were not also driving the display, since the display and operating system reserve part of the card's memory.
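For reference, something along these lines should show how much GPU memory is actually free and force training onto the CPU. This is an untested sketch: lgraph and trainingData are assumed to be the network and datastore from the linked example, and the other training options are placeholders.
% Check how much of the 3 GB is actually free
% (the display and OS can reserve a large share of it).
g = gpuDevice;
fprintf('GPU free memory: %.2f GB of %.2f GB\n', ...
    g.AvailableMemory/1e9, g.TotalMemory/1e9);

% Force training onto the CPU instead of the GPU.
% 'lgraph' and 'trainingData' are assumed to come from the
% SegNet example; the remaining options are placeholders.
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'cpu', ...
    'MiniBatchSize', 1, ...
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 100);

net = trainNetwork(trainingData, lgraph, options);
Training SegNet on the CPU will be much slower, but it sidesteps the GPU memory limit entirely.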