
How to solve an out of memory error?

1 view (last 30 days)
Minu
Minu on 7 May 2013
I am doing my project in OCR. I use a 64x64 image size because when I tried 32x32 and smaller sizes, some pixels were lost. I have tried features such as zonal density, Zernike moments, projection histogram, distance profile, and crossings. The main problem is that the feature vector is too big. I have tried combinations of the above features, but whenever I train the neural network I get an "out of memory" error. I tried PCA for dimensionality reduction, but it did not work well; I did not get good efficiency during training. I ran the code on both my PC and my laptop, and I got the same error on both. My RAM is 2 GB, so I am thinking about reducing the image size. Is there any other solution to this problem?
I have one more problem: whenever I train the neural network with the same features, the result varies. How can I solve this as well?
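A minimal sketch of one way to shrink the feature vector before training, assuming the features are collected into a matrix `X` (nSamples-by-nFeatures; the variable names here are hypothetical, not the poster's code). PCA via the Statistics Toolbox `pca` function keeps only the components that explain most of the variance, and casting to `single` halves memory use:

```matlab
% Hedged sketch: reduce a large feature matrix before network training.
% X is assumed to be an nSamples-by-nFeatures double matrix of features.
X = single(X);                           % halve memory: 4 bytes/element instead of 8
[coeff, score, ~, ~, explained] = pca(double(X));  % principal component analysis
k = find(cumsum(explained) >= 95, 1);    % components explaining ~95% of variance
Xred = score(:, 1:k);                    % reduced feature matrix for training
```

Note that `pca` replaced the older `princomp` in R2012b; on earlier releases `princomp` provides the same decomposition with a slightly different output order.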

Accepted Answer

Greg Heath
Greg Heath on 7 May 2013
Of course pixels are lost when you reduce the size. I am not an expert in imagery, therefore I cannot confidently suggest another method. However, there must be several acceptable ones available. Why don't you submit a post on image feature reduction?
The NNs in the NNTBX randomly divide data and randomly initialize net weights. If you want to reproduce a design, set the RNG to the same initial state as before.
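A short sketch of the seeding idea, assuming a Neural Network Toolbox pattern-recognition net (the network setup here is an illustrative assumption, not the poster's actual model):

```matlab
% Reproducible training: fix the RNG state before creating and training
% the network, so the random data division and weight initialization
% are the same on every run.
rng(0);                        % reset the random number generator
net = patternnet(20);          % hypothetical net with 20 hidden units
net.divideFcn = 'dividerand';  % random train/val/test split (the default)
[net, tr] = train(net, features, targets);
```

Running this twice from the same `rng(0)` state reproduces the same split and the same initial weights, so the trained result no longer varies between runs.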
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Jan
Jan on 7 May 2013
What about installing more RAM?

Categories

Find more on Image Data Workflows in Help Center and File Exchange.

