GTX1060 for deep learning semantic image segmentation
I am attempting to train SegNet for semantic segmentation following the example set here: https://www.mathworks.com/examples/matlab/community/24778-semantic-segmentation-using-deep-learning
However, I keep running into an out-of-memory error. I am wondering whether the error is in my code, or whether my GTX 1060 3 GB GPU simply does not have enough memory to train SegNet as in the example. I have already reduced the mini-batch size to 1, so I'm unsure what other fixes I can make.
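For reference, my training options look roughly like this (a simplified sketch; the solver and learning-rate values are taken from the linked example, and `trainingData` / `lgraph` are placeholders for my datastore and network):

```matlab
% Training options with mini-batch size reduced to 1 to cut GPU memory use
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 100, ...
    'MiniBatchSize', 1);

net = trainNetwork(trainingData, lgraph, options);
```

Even with `'MiniBatchSize'` at 1, the error occurs as soon as training starts.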