Minimizing a prebuilt cost function

I hope this finds everyone well.
I have been attempting to minimize a complex function that depends on a 6x7 initial guess matrix. I have built code that outputs a weighted least-squares difference between the experimental and predicted data. Is there a way to use fmincon, fminsearch, etc. to minimize the value produced by this cost function?
To summarize: I have a model that I transformed into a function whose only input is that 6x7 initial guess matrix, and whose output is a value expressing the difference between the numerical simulation and the experiment. I wish to minimize this value using fmincon, or any other solver, to generate the guesses fed into this function.
Thank you for your time!
Kevin

10 comments

Torsten
Torsten on 9 Feb 2023
So your model tries to identify 6*7 = 42 model parameters?
You should drastically reduce the number of unknowns to be fitted before using an optimizer.
Kevin Hanekom
Kevin Hanekom on 9 Feb 2023
Edited: Kevin Hanekom on 9 Feb 2023
Thank you for the input. Many of the parameters are determined before the optimization step, and I will find a way to eliminate those pre-determined parameters wherever possible. But in the meantime, is there a way to optimize, for instance, a 6- or 7-variable function that outputs the mentioned cost value?
Torsten
Torsten on 9 Feb 2023
Edited: Torsten on 9 Feb 2023
Sure. Use "lsqnonlin".
It requires that - given a vector of parameters - you supply the differences between numerically simulated and experimental data. The solver tries to adjust the parameters such that the sum of the differences squared is minimized.
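To make that concrete, here is a minimal sketch of the lsqnonlin call pattern, using made-up stand-ins (`model`, `t`, `ydata` are all hypothetical names, not from this thread):

```matlab
% Hypothetical example: fit parameter p so that model(p, t) matches ydata.
% Note that lsqnonlin receives the residual VECTOR, not its sum of squares;
% the solver itself forms and minimizes sum(residual(p).^2).
t     = linspace(0, 1, 20)';            % measurement points (made up)
ydata = exp(-2*t) + 0.01*randn(20, 1);  % "experimental" data (made up)
model = @(p, t) exp(-p(1)*t);           % numerical model with one parameter

residual = @(p) model(p, t) - ydata;    % vector of differences
p0 = 1;                                 % initial guess
pfit = lsqnonlin(residual, p0);         % pfit should land near 2
```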
Kevin Hanekom
Kevin Hanekom on 9 Feb 2023
Edited: Kevin Hanekom on 9 Feb 2023
Great. In the code below I input our guess into the previously mentioned "TsWuSph" function, which outputs the predicted numerical data, and subtract the known experimental data from it. However, it seems that lsqnonlin doesn't iterate at all; the solution instantly converges on the value I input as a guess. Do you see where I am going wrong?
%% Organizing all values into a "Guess" matrix
Guess = [F(1,2)];
[x, fval] = lsqnonlin(@(x0) (abs(TsWuSph(x0))-abs(cfinal(3,3))),Guess)
Torsten
Torsten on 9 Feb 2023
So "TsWuSph" is a function that, given values for x0, returns your model values?
And you know that cfinal(3,3) is just one element of the matrix "cfinal", namely the element at position (3,3)?
And "Guess" is a numerical object of the same size as x0 that supplies initial values for the parameters?
And you should return just TsWuSph(x0) - cfinal(3,3) if this is really what you want to minimize (I doubt it!).
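Keeping the thread's own variable names, the corrected call would then look something like this (the abs is dropped because lsqnonlin squares the residual anyway, so it is redundant and only makes the objective non-smooth at zero):

```matlab
% Sketch of the corrected call, assuming TsWuSph and cfinal are defined
% as in the thread and F(1,2) is the scalar initial guess.
Guess = F(1,2);
[x, fval] = lsqnonlin(@(x0) TsWuSph(x0) - cfinal(3,3), Guess)
```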
Kevin Hanekom
Kevin Hanekom on 9 Feb 2023
Yes to all! The absolute difference, TsWuSph(x0) - cfinal(3,3), is what I wish to minimize.
Kevin Hanekom
Kevin Hanekom on 9 Feb 2023
Edited: Kevin Hanekom on 9 Feb 2023
Here is my latest attempt with fmincon:
Guess = [F(1,2)];
options = optimoptions('fmincon','Display','iter','Algorithm','active-set');
[x, fval] = fmincon(@(x0)(TsWuSph(x0)-cfinal(3,3)),Guess,[],[],[],[],[],[],[],options)
And here is the output.
Max Line search Directional First-order
Iter F-count f(x) constraint steplength derivative optimality Procedure
0 2 -52.2326 0
Local minimum found that satisfies the constraints.
Optimization completed because the objective function is non-decreasing in
feasible directions, to within the value of the optimality tolerance,
and constraints are satisfied to within the value of the constraint tolerance.
<stopping criteria details>
Is it possible the solution is digging itself into a local minimum?
Torsten
Torsten on 9 Feb 2023
We don't know F(1,2).
We don't know TsWuSph.
We don't know cfinal.
So we cannot tell you anything about what happens and what possibly has to be changed.
Matt J
Matt J on 10 Feb 2023
Edited: Matt J on 10 Feb 2023
Yes to all! The absolute difference, TsWuSph(x0) - cfinal(3,3), is what I wish to minimize.
Since cfinal(3,3) is a scalar value, that would be equivalent to solving for multiple unknowns x0 given a single equation. It is a considerably under-determined problem.
Kevin Hanekom
Kevin Hanekom on 10 Feb 2023
Edited: Kevin Hanekom on 10 Feb 2023
Thank you for the input, Matt. I apologize for the confusion; in this case x0 is a single variable that I input into the function I have defined, called TsWuSph. This function outputs an expected numerical value, which I wish to bring as close as possible to the experimental scalar value cfinal(3,3). Just to summarize, x0 is the single unknown in this case.

Sign in to comment.

 Accepted Answer

Kevin Hanekom
Kevin Hanekom on 10 Feb 2023


My problem was a classic example of a derivative-based algorithm converging to a local, but not global, minimum. To solve this one can use a heuristic or population-based algorithm, in this case either GA or simulated annealing, as described in this great textbook: MIT Book.
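For reference, a rough sketch of how either global solver could be applied to the scalar objective from this thread. This assumes the Global Optimization Toolbox is available, and the bounds lb and ub are hypothetical placeholders for a range that brackets the physically plausible parameter values:

```matlab
% Cost function from the thread (TsWuSph and cfinal as defined above)
fun = @(x0) abs(TsWuSph(x0) - cfinal(3,3));
lb = 0;  ub = 10;                        % hypothetical parameter bounds

% Genetic algorithm: population-based, derivative-free
[x_ga, f_ga] = ga(fun, 1, [], [], [], [], lb, ub);

% Simulated annealing: accepts some uphill moves early on,
% which helps it escape shallow local minima
x0 = (lb + ub)/2;                        % arbitrary start inside the bounds
[x_sa, f_sa] = simulannealbnd(fun, x0, lb, ub);
```

Both solvers are stochastic, so repeated runs can return slightly different minimizers; tightening the bounds is usually the cheapest way to improve their reliability.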
Thank you everyone for your help.

2 comments

It is really unlikely you would do that just to avoid local minima for a 1-parameter problem. You would probably just sample the function over a range of points and use min:
c=cfinal(3,3);
fun= @(x0) abs(TsWuSph(x0)-c);
x=linspace(a,b);
[~,i]=min(arrayfun(fun, x));
Guess=x(i);
Kevin Hanekom
Kevin Hanekom on 10 Feb 2023
The one-parameter problem was just an initial simplification of a much more complex problem statement. I am sure the min function would work for the one-parameter problem. Thank you for your help with my problem.

Sign in to comment.

More Answers (1)

Matt J
Matt J on 10 Feb 2023
Edited: Matt J on 10 Feb 2023
Just to summarize, x0 is the single unknown in this case.
If so, both lsqnonlin and fmincon are overkill. You should just use fminbnd or fminsearch, e.g.,
c=cfinal(3,3);
[x, fval] = fminsearch( @(x0) abs(TsWuSph(x0)-c) , Guess)
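For the fminbnd alternative, the call takes an interval instead of a starting point; the bounds a and b here are hypothetical placeholders for a range known to bracket the solution:

```matlab
% fminbnd variant: golden-section search over a bounded interval,
% so no initial guess is needed (assumes TsWuSph and cfinal exist).
c = cfinal(3,3);
a = 0;  b = 10;                          % hypothetical search interval
[x, fval] = fminbnd(@(x0) abs(TsWuSph(x0) - c), a, b)
```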

2 comments

Thank you for the input. Using this, I received the following output.
Elapsed time is 0.202791 seconds.
Iteration Func-count min f(x) Procedure
0 1 52.2326
Elapsed time is 0.169855 seconds.
1 2 52.2326 initial simplex
Optimization terminated:
the current x satisfies the termination criteria using OPTIONS.TolX of 1.000000e-04
and F(X) satisfies the convergence criteria using OPTIONS.TolFun of 1.000000e-04
It seems the code is not attempting to minimize f(x), which, unless I am mistaken, should be driven as close to 0 as possible.
Here is the exact code I used.
%% Organizing all values into a "Guess" matrix
Guess = [F(1,2)];
options = optimset('Display','iter');
c=cfinal(3,3);
[x, fval] = fminsearch( @(x0) abs(TsWuSph(x0)-c) , Guess, options)
Matt J
Matt J on 10 Feb 2023
Edited: Matt J on 10 Feb 2023

Sign in to comment.

Version: R2021a

Asked: 9 Feb 2023

Commented: 10 Feb 2023
