minimize a function fast

Hi!
I am trying to minimize a sum of squares of the form sum((Y - exp(a + b_1*X1 + b_2*X2)).^2).
Y, X1, and X2 are all vectors, and I am trying to find the a, b_1, and b_2 that minimize this function. It's basically a nonlinear regression. So far I have always used fminunc, but it is very slow. I need to do this many times, so my program runs for more than a day. I appreciate your help. Thank you!
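For reference, a minimal sketch of the fminunc formulation described above (Y, X1, X2 are assumed to be column vectors; the function handle name and starting point are illustrative, not from the original post):

```matlab
% Sum-of-squares objective for p = [a; b1; b2] (names assumed).
sse = @(p) sum((Y - exp(p(1) + p(2)*X1 + p(3)*X2)).^2);

p0   = zeros(3,1);            % naive initial guess
pHat = fminunc(sse, p0);      % slow: gradients estimated by finite differences
```

Without a supplied gradient, fminunc approximates derivatives numerically, which is one reason this approach is slow when repeated many times.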

Accepted Answer

Matt J
Matt J on 23 Oct 2014
Edited: Matt J on 24 Oct 2014

3 votes

  1. lsqcurvefit is better for a problem of that form.
  2. Are you supplying your own derivative calculations using the GradObj option (and Hessian option if applicable)? You should do so, since the analytical derivatives are easy here. With lsqcurvefit, there are similar options, e.g. Jacobian.
  3. How are you initializing the optimization? Because your model is log-linear, the initial guess generated below will likely be more effective than a random guess.
n = numel(Y);
x0 = [ones(n,1), X1(:), X2(:)] \ log(Y(:));  % x0 = [a; b_1; b_2]
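Putting the three suggestions together, a sketch might look like the following (untested; the function name modelJac is an assumption, and the option name for analytical Jacobians varies by MATLAB release, e.g. 'Jacobian' in older versions):

```matlab
% modelJac.m -- model value and analytical Jacobian for lsqcurvefit.
% X is an n-by-2 matrix [X1, X2]; p = [a; b1; b2].
function [F, J] = modelJac(p, X)
    F = exp(p(1) + X*p(2:3));                 % exp(a + b1*X1 + b2*X2)
    if nargout > 1                            % Jacobian requested by solver
        J = [F, F.*X(:,1), F.*X(:,2)];        % dF/da, dF/db1, dF/db2
    end
end
```

Driver script:

```matlab
X  = [X1(:), X2(:)];
x0 = [ones(numel(Y),1), X] \ log(Y(:));       % log-linear initial guess
opts = optimoptions('lsqcurvefit', 'Jacobian', 'on');
pHat = lsqcurvefit(@modelJac, x0, X, Y(:), [], [], opts);
```

The Jacobian here reuses the model value F, so supplying it costs almost nothing beyond the function evaluation itself.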

1 comment

Kim
Kim on 24 Oct 2014
Dear Matt, thank you very much. I implemented your suggestion, including the Jacobian, and it runs much faster. As the initial guess I took the betas resulting from the regression of log(Y) on X.


More Answers (0)

Asked:
Kim
on 23 Oct 2014

Commented:
Kim
on 24 Oct 2014
