What can I use instead of 'lsqcurvefit'
35 views (last 30 days)
Hello everyone,
I am using 'lsqcurvefit' for nonlinear curve fitting. The problem is that I always get the message "local minimum possible" or "solver stopped prematurely". In addition, when I change the initial values, the results change. I don't know what I should do to solve this problem and avoid local minima. I also changed FunctionTolerance and other options, but it didn't help. Could you please help me solve this problem?
Thanks,
Faezeh
Answers (3)
Star Strider
17 Jan 2022
The ga function in the Global Optimization Toolbox does not use gradient descent and is therefore more resistant to getting stuck in local minima. The fminsearch function (core MATLAB) is also derivative-free and, provided that the number of parameters to be estimated is less than about 7, may be able to estimate the parameters reliably.
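For concreteness, a minimal sketch of using fminsearch this way, by minimising the sum of squared residuals directly (the exponential model and synthetic data here are made up for illustration, not taken from the question):

```matlab
% Hypothetical synthetic data and a two-parameter exponential model
x = linspace(0, 5, 50).';
y = 2*exp(-0.7*x) + 0.05*randn(size(x));   % noisy data from a known model

model = @(b, x) b(1).*exp(-b(2).*x);       % b = [amplitude; decay rate]
sse   = @(b) sum((y - model(b, x)).^2);    % scalar objective for fminsearch

b0   = [1; 1];                             % initial guess
best = fminsearch(sse, b0);                % derivative-free Nelder-Mead search
```

Note that fminsearch minimises a scalar objective, so the squared residuals are summed by hand, whereas lsqcurvefit takes the model and data directly.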
2 comments
Star Strider
18 Jan 2022
My pleasure!
The ga solver may require a large number of generations to converge on an acceptable solution. It will generally find the best one; however, like all optimisation routines, its success is highly dependent on how well the problem is posed, so the model it fits needs to represent the process that created the data. The InitialPopulationMatrix also needs to allow for the scaling of the different parameters, if necessary. This is relatively straightforward, however it requires some advance knowledge of the parameter magnitudes. If all the magnitudes are reasonably close to each other, this should not be a problem. However, if some are significantly greater than the others, it may be difficult for any optimisation routine to find acceptable parameters.
If lsqcurvefit gives reasonable but suboptimal results, use those results along with the InitialPopulationMatrix to give ga some advance knowledge of the relative magnitudes of the parameters.
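A sketch of that seeding idea (the model, data, and population size are all hypothetical; the essential part is scattering the initial population around the lsqcurvefit estimate):

```matlab
% Hypothetical model and synthetic data
x = linspace(0, 5, 50).';
y = 2*exp(-0.7*x) + 0.05*randn(size(x));
model = @(b, x) b(1).*exp(-b(2).*x);

% Rough estimate from lsqcurvefit, possibly only a local minimum
bLsq  = lsqcurvefit(model, [1; 1], x, y);
nvars = numel(bLsq);

% 50 individuals scattered (here, +/-10%) around the lsqcurvefit estimate,
% so the initial population already reflects the parameter magnitudes
pop  = bLsq(:).' .* (1 + 0.1*randn(50, nvars));
opts = optimoptions('ga', 'InitialPopulationMatrix', pop, ...
                    'MaxGenerations', 500);

bGa = ga(@(b) sum((y - model(b, x)).^2), nvars, ...
         [], [], [], [], [], [], [], opts);
```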
Walter Roberson
17 Jan 2022
lsqcurvefit()'s normal output is "local minimum possible".
The only time you do not get that message is if the fitting is being stopped because something went wrong (the wrong number of values returned, for example), or because NaN/Inf checking is turned on and you encounter one of those -- or because fitting determines that the minimum is being blocked by a boundary condition you have set.
Suppose you ask to fit something extremely simple such as @(x)x.^2 - 1 . The results quickly converge to x == 1 (or x == -1). But can lsqcurvefit() know that you have found a global minimum for that function? NO. lsqcurvefit() has no ability to tear apart the function handle and do calculus on the formula to prove that it really is a global minimum. lsqcurvefit() does not know that your function was not, for example, @(x)x.^2 - 1 - (x == -732431.18935)*10^20 , which has a global minimum at x == -732431.18935 .
So lsqcurvefit() does not say "problem solved" unconditionally: it just says that the location it found looks good... but that it might have missed something.
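That start-point dependence is easy to demonstrate. A small illustration (using fminsearch on the squared residual, since the toy function is one-dimensional):

```matlab
% Minimising f(x) = (x^2 - 1)^2 from different starting points
% lands in different minima -- here, the two roots x = 1 and x = -1.
f = @(x) (x.^2 - 1).^2;

xPos = fminsearch(f, 2);    % converges near  1
xNeg = fminsearch(f, -2);   % converges near -1
```

Both answers are legitimate minima; the solver has no way to know whether some other, better minimum exists elsewhere.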
John D'Errico
17 Jan 2022
A solver CANNOT know if it is in the global minimum or a local minimum. Even a putatively "global" solver cannot be certain except for very specific problems where you know a lot about the problem.
So lsqcurvefit tells you it thinks it found a solution, that may be a local min. It is just being accurate. The problem is in your perception of that announcement, not necessarily in the result itself.
In the case of "solver stopped prematurely", this may indicate a problem with the model, with your starting values, or with your data. Since we see none of these things, we are unable to help you there.
In general, any solution is no better than the starting values you give it and the ability of the model to adequately represent the data it is being fit to. So if you are unhappy with the results of the fit, you might look more carefully at the model. Does it really represent the data you want it to fit? If you still believe it does, yet you are unhappy, then look at your starting values. Need a better fit? Then choose more intelligent starting values. This can be especially important for certain classes of model, but also for problems with large noise in the data. In the latter case, what you really most need to do is get better data.
Sorry, but if you want better help, then I would strongly suggest providing a good example of your data, as well as the model you use to fit it, and the code you wrote.
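When no single good starting point is known, one common way to act on this advice is to try many of them automatically. A minimal sketch (not from the original answer; the synthetic data, model, and bounds are all hypothetical) using MultiStart from the Global Optimization Toolbox to run lsqcurvefit from several random starting points and keep the best result:

```matlab
% Hypothetical model and synthetic data
x = linspace(0, 5, 50).';
y = 2*exp(-0.7*x) + 0.05*randn(size(x));
model = @(b, x) b(1).*exp(-b(2).*x);

% Wrap the fit as an optimization problem with bounds on the parameters
problem = createOptimProblem('lsqcurvefit', 'objective', model, ...
    'x0', [1; 1], 'xdata', x, 'ydata', y, ...
    'lb', [0; 0], 'ub', [10; 10]);

ms = MultiStart('Display', 'off');
[bBest, resnorm] = run(ms, problem, 20);   % 20 random starts within the bounds
```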