Slow/Non-Convergence for Large Nonlinear Programming Problem -- Better Way to Solve/Formulate?

I have implemented a program to solve the constrained least-squares problem defined in the images in the original post.
For this problem, the f_ij and tau_j are the variables to be optimized. qobs_jk is given data, and q_jk is calculated by the recursion shown in the second image.
The MATLAB code to optimize this problem looks something like (thank you Matt J for the help!):
%Initial guesses
sol0.f=f_guess; %ni by np (matches the optimvar declaration below)
sol0.tau=tau_guess; %np by 1
%declare optimization variables, bounds, and constraints
f=optimvar('f',[ni,np],'LowerBound',0, 'UpperBound',1);
tau=optimvar('tau',np,'LowerBound',0);
con.sumbound=sum(f,2)<=1;
[p,idx]=prob2matrices({f,tau},'Constraints',con,'sol0',sol0); %convert constraints into appropriate data structure (function by Matt J https://www.mathworks.com/matlabcentral/fileexchange/74481-prob2matrices-a-selective-version-of-prob2struct)
%%Optimization
fun=@(unknowns)CRMPobj(unknowns, idx, injection, production, time, Gamma); %this step creates the function to be optimized
options= optimoptions('fmincon', 'MaxFunctionEvaluations', 10000); %enforce max function evaluations (default is 3000)
[unknowns, obj]=fmincon(fun,p.x0,p.Aineq,p.bineq,[],[],p.lb,p.ub,[],options); %this step solves the optimization
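CRMPobj itself is not shown above. For context, here is a minimal sketch of what such an objective might compute, assuming the standard CRMP recursion q_j(k) = q_j(k-1)*exp(-dt_k/tau_j) + (1 - exp(-dt_k/tau_j))*sum_i f_ij*I_i(k) and a least-squares misfit (the equations in the original post were images; the function name, idx field names, argument layout, and initial condition below are all assumptions):
function z = CRMPobj_sketch(unknowns, idx, inj, qobs, dt)
%Sketch only -- assumes the standard CRMP recursion; the idx field names, the
%argument layout (inj: nt-by-ni, qobs: np-by-nt, dt: nt-by-1), and the initial
%condition q_j(0) = qobs_j1 are assumptions, not taken from the original post
ni    = size(inj,2);                      %number of injectors
tau   = unknowns(idx.tau);                %np-by-1 time constants
f     = reshape(unknowns(idx.f), ni, []); %ni-by-np connectivities (assumed packing)
nt    = size(inj,1);                      %number of time steps
q     = zeros(numel(tau), nt);            %np-by-nt modeled production rates
qprev = qobs(:,1);                        %assumed initial condition
for k = 1:nt
    theta  = exp(-dt(k)./tau);                            %np-by-1 decay factors
    q(:,k) = qprev.*theta + (1-theta).*(f.'*inj(k,:).');  %CRMP recursion (assumed form)
    qprev  = q(:,k);
end
z = sum((q - qobs).^2, 'all');            %least-squares misfit against observed rates
end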
The problem solves correctly and quickly for a simple example problem with ni=5, np=4, and nt=200. However, when moving to a real data set with ni=182, np=300, and nt=25, I fail to see any meaningful progress in the solution before the solver's evaluation limit is reached (I am working on having MATLAB installed on my company's server, so RAM should soon no longer be an issue). I have a couple of remarks and guiding questions that may be useful in reaching a solution to this problem (forgive my lack of formal NLP knowledge):
  1. Before running out of memory, I am able to run fmincon() for approximately 20000 iterations. In this time, the solution obtained for f_ij and tau_j is barely (if at all) changed from the initial guesses, and the objective function, accordingly, doesn't change much. Is this lack of change (1) because the problem requires many more iterations to converge to a correct solution (hundreds of thousands, millions? does that become a computational-time concern?), or (2) because the function is likely getting stuck in a local minimum? If the latter:
  2. Is fmincon() the best way to solve this problem? Are there other local optimizers that are more useful or is a particular global optimizer recommended?
  3. In terms of converging upon a solution, is the number of constraints an issue, or is it more likely the number of variables being optimized? I am optimizing ni*np + np variables subject to ni constraints.
All in all, I am looking to better understand whether my difficulty in solving this problem for a real data set is due to (1) computation-time and memory limits and/or (2) the way I have formulated this problem in MATLAB. Any insight into the nature of this problem and how I might reach a solution is much appreciated.
  2 comments
Matt J on 13 Jul 2021
Edited: Matt J on 13 Jul 2021
Before running out of memory, I am able to run fmincon() for approximately 20000 iterations
It may be good to see what optimoptions you are actually using. With the code you have posted (MaxFunctionEvaluations = 10000), fmincon could not possibly run for 20000 iterations.
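One way to check this is to print the options actually in effect and to watch per-iteration progress; a sketch (the limits below are placeholders):
%Sketch: show the options fmincon will actually use and report progress each iteration
options = optimoptions('fmincon', ...
    'Display','iter', ...                 %prints objective, step size, and first-order optimality per iteration
    'MaxFunctionEvaluations',1e5, ...     %placeholder limits -- raise only as your budget allows
    'MaxIterations',1e4);
disp(options)                             %confirm the settings before passing options to fmincon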
Matt J on 22 Jul 2021
What are the initial conditions on the recursion that defines qjk? What is q_j(k-1) when k=1?


Answers (1)

Matt J on 13 Jul 2021
Edited: Matt J on 13 Jul 2021
Before running out of memory, I am able to run fmincon() for approximately 20000 iterations.
It is concerning that you are running out of memory. The amount of consumed memory should normally not increase as the optimization progresses.
In this time, the solution obtained for f_ij and tau_j is barely (if at all) changed from the initial guesses, and the objective function, accordingly, doesn't change much.
It might be worth checking the gradients and the Hessian to see if they have vanishingly small values.
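A minimal sketch of such a probe at the initial point, assuming fun and p.x0 from the posted code (the step size and the random coordinate subset are placeholders):
%Sketch: finite-difference gradient estimate at the initial point
x0 = p.x0;
f0 = fun(x0);
h  = 1e-6;                                        %placeholder step size
subset = randperm(numel(x0), min(200,numel(x0))); %probe a random subset -- the full loop is expensive
g  = zeros(size(subset));
for i = 1:numel(subset)
    e = zeros(size(x0));  e(subset(i)) = h;
    g(i) = (fun(x0 + e) - f0)/h;                  %forward difference in one coordinate
end
fprintf('max |dz/dx_i| over probed coordinates: %g\n', max(abs(g)));  %near zero suggests a flat region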
Is fmincon() the best way to solve this problem?
fmincon is the only local optimizer that will solve a problem of this form. A global optimizer might be worth a try (e.g., ga()) if an educated initial guess is hard to derive.
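If ga() is tried, it can reuse the bounds and linear inequality constraints already produced by prob2matrices; a sketch (the population size is a placeholder, and ga passes 1-by-nvars row vectors to the fitness function, so the transpose below assumes fun expects a column):
%Sketch: global search with ga() reusing the bounds and linear constraints from p
nvars     = numel(p.x0);
gaOptions = optimoptions('ga', ...
    'PopulationSize',200, ...               %placeholder -- a problem this size may need far more
    'InitialPopulationMatrix',p.x0.');      %seed the search with the current initial guess
[xga, objga] = ga(@(x)fun(x.'), nvars, ...  %ga supplies 1-by-nvars rows; transpose for a column-based fun
    p.Aineq, p.bineq, [], [], p.lb, p.ub, [], gaOptions);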
In terms of converging upon a solution, is the number of constraints an issue, or is it more likely the number of variables being optimized? I am optimizing ni*np + np variables subject to ni constraints.
My intuition says that it is a case of vanishing gradients. You'll recall that I had concerns about formulating the optimization in terms of tau, where the function has some very flat areas, as opposed to theta=exp(-delta_t/tau). The increase in the problem dimension may be exacerbating the problem.
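A sketch of that reparameterization, assuming a single representative time step delta_t (the actual steps may vary): optimize theta on (0,1) and recover tau afterwards.
%Sketch: optimize theta = exp(-delta_t/tau) instead of tau, so the solver works on (0,1)
%rather than on the flat tails of the exponential; delta_t as a single representative
%time step is an assumption
theta = optimvar('theta',np,'LowerBound',1e-6,'UpperBound',1-1e-6);
%inside the objective, use theta_j wherever exp(-delta_t/tau_j) appeared, e.g.
%q(:,k) = qprev.*theta + (1-theta).*(f.'*inj(k,:).');
%after solving, recover the time constants from the optimized theta values:
tau_from_theta = @(th) -delta_t./log(th);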
  17 comments
Matt J on 23 Jul 2021
Edited: Matt J on 23 Jul 2021
For example, the Jacobian of q with respect to f.' will be,
Jf = kron((I*M1(tau_1)*M2(tau_2)*M3(tau_3)*...*Mn(tau_n)).', speye(np))
and the gradient of z with respect to f will be,
dzdf(:) = 2*I*M1(tau_1)*M2(tau_2)*M3(tau_3)*...*Mn(tau_n)*(qobj-q).'
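Once such a gradient is computed, it can be returned as a second output of the objective and enabled with 'SpecifyObjectiveGradient'; a sketch (the packing order of the gradient entries is an assumption and must match the order of the unknowns in p.x0):
%Sketch: return the analytic gradient as a second output and tell fmincon to use it
%   function [z, grad] = CRMPobj(unknowns, idx, injection, production, time, Gamma)
%       ...                                %compute z as before
%       grad = [dzdf(:); dzdtau(:)];       %one entry per unknown (ni*np + np total), packed like p.x0
%   end
options = optimoptions('fmincon', ...
    'SpecifyObjectiveGradient',true, ...   %use the returned gradient instead of finite differences
    'CheckGradients',true, ...             %verify against finite differences on a small case, then disable
    'MaxFunctionEvaluations',10000);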
Corey Hoydic on 23 Jul 2021
Edited: Corey Hoydic on 24 Jul 2021
I tried parallelizing last night (6 workers) and the run was actually slower, so I imagine that supplying the gradient is my best bet for speeding things up. I've got a few questions:
  1. Can you give some more info about the form of the sparse upper triangular matrix? I am not sure what a sparse upper triangular matrix that is a function of tau should look like.
  2. What should the size of the gradient be? Recall that my unknowns are f, which is np*ni, and tau, which is np, for a total of np*ni+np unknowns. Should the gradient not be with respect to each of these unknowns, not just f?
  3. qobj and q are np by nt -- does any summation need to occur across time? I'm not sure the matrix dimensions work out otherwise. Hope it's not asking too much, but laying out the dimensionality of each matrix would be extremely helpful.
Apologies for the basic questions. You can tell this is not my area of expertise.

