Out of memory errors when using parfor

8 views (last 30 days)
XINJIE XING
XINJIE XING on 23 May 2020
Edited: XINJIE XING on 24 May 2020
Hi there,
I am currently writing code with two nested loops: the outer loop enumerates samples, and the inner loop solves a large optimisation problem for different input parameters. A minimal reproducible example is below:
for sss = 1:sample_size
    NN_ttt = [];
    tf_ttt = [];
    th_ttt = [];
    day_ttt = [];
    Profit_ttt = [];
    parfor idx_tfth = 1:size(matrix_day,1)
        day = matrix_day(idx_tfth,:)';
        tf = matrix_tf(idx_tfth,:)';
        th = matrix_th(idx_tfth,:)';
        tf_ttt(:,idx_tfth) = tf;
        th_ttt(:,idx_tfth) = th;
        day_ttt(:,idx_tfth) = day;
        [NN_ttt(idx_tfth,1), Profit_ttt(idx_tfth,1)] = f_for(M, tc, sss, A_dist, AC_index_all, AE_index, AG_index, AV_index, cc, customer_index, depot_index, ...
            lambda_NN, revenue, Ch, Cij, C, Cn, Cp, t_end, planninghorizon, NN_ave, ...
            day, tf, th, Chd);
    end
    id_max = find(Profit_ttt == max(Profit_ttt));
    NN_ttt_best = NN_ttt(id_max);
    NN_res(sss,1) = NN_ttt_best;
end
In order to fully utilise my cores, I decided to use parfor for the inner loop, so that each core runs an optimisation in parallel, fed with different parameters from matrix_day. After each optimisation finishes, the outputs of the f_for function are very small variables (i.e. NN_ttt and Profit_ttt, each a 400x1 double). Once the best solution is found, they are immediately re-initialised for the next iteration. The optimisation is performed in f_for; the variables and constraints are defined with the YALMIP toolbox, and the solver is Gurobi 9.0.
Once I run the code, there is a noticeable increase in RAM usage, up to about 50% of my total memory (32 GB). I think this should be fine, given the use of parfor. But as the outer iteration moves on, the total memory usage keeps accumulating, and when I increase the outer iteration count to 20, I encounter an out of memory error. My question is: if my memory is sufficient to complete one full pass of the inner parfor, then the memory pool should be fine thanks to automatic release, and I only keep two very small variables (NN_ttt_best and NN_res, whose elements are doubles less than 500). So why, as the outer loop carries on, does it eat my memory so heavily? When the function f_for finishes, shouldn't all of its input variables be cleared automatically? If so, why does the out of memory error happen? Thank you very much for your time and your valuable reply.
  2 Comments
Johan Löfberg
Johan Löfberg on 24 May 2020
I don't know how parfor changes things, but a typical mistake is to repeatedly define more and more YALMIP variables in a loop instead of re-using them. If you cannot refactor your code to avoid that (which you probably cannot, as YALMIP cannot be used across parfor loops), you should use yalmip('clear') before setting up the new problem.
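As an illustration, a minimal sketch of that pattern inside the per-iteration solve (the function name f_solve_once and the problem data A, b, c, n are placeholders for illustration, not from the original code):
function [NN, Profit] = f_solve_once(A, b, c, n)
    % Drop every YALMIP variable defined in earlier calls so the internal
    % model database does not keep growing across iterations.
    yalmip('clear');
    x = sdpvar(n, 1);                      % fresh decision variables
    Constraints = [A*x <= b, x >= 0];
    Objective = -c'*x;                     % YALMIP minimises, so negate to maximise c'*x
    ops = sdpsettings('solver', 'gurobi', 'verbose', 0);
    optimize(Constraints, Objective, ops);
    NN = value(x(1));
    Profit = -value(Objective);
end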
XINJIE XING
XINJIE XING on 24 May 2020
Edited: XINJIE XING on 24 May 2020
Hi Johan,
Thank you very much for your reply. Coincidentally, I was looking at your research profile and your paper just yesterday.
I think I have managed to find the reason. MATLAB does not release the memory allocated to the workers after each parfor finishes; it just keeps adding more space to them. So I ended up using delete(gcp) to reset the pool once each parfor finishes. It adds a delay of several seconds per iteration, but it seems to be the best way to run the code to completion.
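In outline, a sketch of that pattern (the 'nocreate' option is an optional refinement; the next parfor, or an explicit parpool call, starts a fresh pool):
for sss = 1:sample_size
    parfor idx_tfth = 1:size(matrix_day, 1)
        % ... solve one optimisation per parameter set, as in the example above ...
    end
    % Shut the current pool down so the workers' memory is released;
    % it is recreated automatically when the next parfor starts.
    delete(gcp('nocreate'));   % 'nocreate' avoids starting a pool just to delete it
end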
Another thing: yes, you are right, YALMIP unfortunately needs to define its variables inside the loop here, and if you use it across parfor loops you get an "index exceeds matrix dimensions" error.
P.S. Your YALMIP toolbox is a masterpiece; thank you so much for such a great tool.


Answers (0)

