How do I implement a positivity constraint in the Nelder-Mead method?

mosayyeb on 3 Mar 2014
Commented: mosayyeb on 7 Mar 2014
Hello to all. I have a function that is not smooth and continuous, so I want to use the Nelder-Mead method for optimization. But my parameters should be positive. How should I do this?
Thanks in advance.

Answers (3)

John D'Errico on 3 Mar 2014
Edited: John D'Errico on 4 Mar 2014
Use fminsearchbnd, which allows bound constraints in fminsearch.
In the case of simple lower bound constraints, it actually uses the trick suggested by Matt, but it does the bookkeeping for you.
As far as the function not being smooth or even continuous, be VERY careful here. Non-smooth objectives will sometimes work acceptably, since Nelder-Mead schemes are somewhat more robust to such problems than methods that use derivatives or form finite-difference approximations of them. (Most optimizers do exactly that, so they can have trouble with non-smooth surfaces.)
Even so, Nelder-Mead schemes may still get stuck or lost on strongly non-smooth objectives. And as far as discontinuous objectives, expect problems, and serious ones. Good starting values will be imperative. (Note that a simple boundary discontinuity, such as the one imposed by a penalty function approach, is not that terrible. Nelder-Mead can take that in stride and backs off. But an internal discontinuity may be far more difficult to overcome, as the simplex scheme will not be able to see jump discontinuities in the objective.)
Often a better choice for the nastiest objectives is some form of stochastic optimizer, such as genetic algorithms, simulated annealing, or one of the many variations on particle swarm optimizers. These tools will be less sensitive in general to such problems, although sometimes nothing may work on the truly nasty problem.
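As a runnable illustration of bound-constrained Nelder-Mead on a non-smooth objective, here is a sketch in Python rather than MATLAB (fminsearchbnd is a File Exchange submission; SciPy's Nelder-Mead has accepted a `bounds` argument directly since SciPy 1.7, handling the bookkeeping much as fminsearchbnd does). The objective and starting point are made up for the example:

```python
import numpy as np
from scipy.optimize import minimize

# A non-smooth (but continuous) objective: |x - 3| + |y - 0.5|.
# Its minimizer [3, 0.5] satisfies the positivity constraint.
def objective(p):
    return abs(p[0] - 3.0) + abs(p[1] - 0.5)

# bounds=(0, None) per variable enforces x >= 0, y >= 0.
res = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead",
               bounds=[(0, None), (0, None)])
print(res.x)  # close to [3, 0.5]
```

Because the objective here is merely non-smooth (not discontinuous), Nelder-Mead handles it; with a genuinely discontinuous objective, the caveats above apply and a stochastic solver may be the better tool.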
  3 comments
John D'Errico on 6 Mar 2014
I'm not sure what I said about it, but that is really an illusion. You don't actually get usefully distinct multiple solutions, as they all map to the same thing. For example, if you have a lower bound constraint, then fminsearchbnd internally performs that x.^2 transformation. So it might converge to -2 or 2 internally, but either value of that internal parameter maps to 4, and 4 is the value that will be used by your objective function. In effect, the user-provided objective function never sees any value other than the solution, here 4. Fminsearchbnd takes care of squaring the result when returned to the user, so they see 4, not the value that fminsearch converged to, which was either -2 or +2.
So again, this is not a true multiplicity, since the user and their objective function only ever see the positive (squared) parameter.
Sorry, I will not provide an example, because it would not show anything pertinent. The code works as designed, and you need do nothing special except provide your bound(s) to the code.



James Tursa on 3 Mar 2014
Another trick is to return f(x) = Inf whenever any of the x(i) is not positive. Make your starting guess all positive, so that f(starting guess) is finite.
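A sketch of this Inf-penalty trick, shown here in Python with SciPy's Nelder-Mead (the same idea works with fminsearch in MATLAB by returning Inf from the objective); the quadratic objective is made up for the example:

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Reject any point with a non-positive coordinate by returning +Inf;
    # the Nelder-Mead simplex simply backs away from such points.
    if np.any(x <= 0):
        return np.inf
    return (x[0] - 2.0) ** 2 + (x[1] - 0.5) ** 2

# The starting guess must be strictly positive so f(x0) is finite.
res = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
print(res.x)  # close to [2, 0.5]
```

As John notes above, this boundary discontinuity is one that Nelder-Mead generally takes in stride, provided the starting simplex is feasible.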

Matt J on 3 Mar 2014
A popular trick is to change all your variables to squares, i.e., instead of minimizing f(x), minimize f(y.^2) where x=y.^2 is a change of variables. If you try this, it is advisable not to initialize with y0=0.
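A sketch of this change of variables, again in Python with SciPy's Nelder-Mead (in MATLAB the wrapper would be an anonymous function @(y) f(y.^2) passed to fminsearch); the one-dimensional objective is made up for the example:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Original objective; its minimizer x = 2 is positive.
    return (x - 2.0) ** 2

# Optimize over y, evaluating f at x = y^2 so x is always nonnegative.
# Starting from y0 = 1 (not 0, per the advice above).
res = minimize(lambda y: f(y ** 2), x0=[1.0], method="Nelder-Mead")
x_opt = res.x ** 2  # map back to the original variable
print(x_opt)  # close to [2]
```

The solver may converge to y = +sqrt(2) or y = -sqrt(2); both map to the same x = 2, which is the apparent multiplicity discussed in the comments above.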

