
lsqnonlin and Jacobian misunderstanding: what is the definition of the Jacobian?

10 views (last 30 days)
Hello!
I use the lsqnonlin Matlab function to fit a curve, called f, to my experimental points (coordinates x_i and y_i). To keep it simple:
[optimum_result,resnorm,residual,exitflag,output,lambda,jacobian] = lsqnonlin( y_i - f(a,x_i) ), where a is my fit parameter.
I am wondering what the definition of the Jacobian returned by Matlab is:
- the square of the Jacobian returned by lsqnonlin = the second derivative of the squared residual (calculated at the optimum, i.e. at the best fit parameter found). Here my residual is y_i - f(a,x_i). This is the definition found here: http://www.ligo-wa.caltech.edu/~ehirose/work/andri_matlab_tools/fitting/MatlabJacobianDef.pdf
OR
- the Jacobian returned by lsqnonlin = the derivative of the residual (calculated at the optimum). This is what I understood from reading the Matlab help.
If the answer is the derivative of the residual (calculated at the optimum), I have a misunderstanding: at the optimum, the sum of my squared residuals has to be a minimum, so the Jacobian, being a derivative, should be equal to (or close to) zero. Yes or no? In Matlab it is not equal to zero, which is why I am confused.
Thanks.

Accepted Answer

Alan Weiss
Alan Weiss on 5 Aug 2013
You have a slight misunderstanding of what a Jacobian is for a sum-of-squares problem. The definition is here in the documentation.
In detail, the objective function is
sum((F(x,xdata) - ydata).^2)
The Jacobian is
J(i,j) = partial(F(x,xdata)(i))/partial(x)(j)
There is no reason to think that J is near zero at a solution. The gradient of the objective function is
2*J'*(F(x,xdata) - ydata)
and if the residual F(x,xdata) - ydata is near zero then the gradient of the objective is near zero, but J is not necessarily small.
Alan Weiss
MATLAB mathematical toolbox documentation
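As a sanity check of the relationship described above, here is a minimal sketch (it assumes the Optimization Toolbox is available; the data values are purely illustrative, and the one-parameter model x*sin(xdata) is borrowed from the example below). It verifies that the gradient 2*J'*residual is near zero at the solution while J itself is not:

xdata = linspace(0, 2*pi, 20);            % hypothetical sample points
ydata = 3*sin(xdata) + 0.05*randn(1, 20); % hypothetical noisy data, true parameter = 3
g = @(x) x*sin(xdata) - ydata;            % residual vector passed to lsqnonlin
[x, resnorm, residual, ~, ~, ~, J] = lsqnonlin(g, 1);
J = full(J);                              % lsqnonlin returns the Jacobian as a sparse matrix
grad = 2*J'*residual(:);                  % gradient of sum(residual.^2)
disp(norm(grad))                          % near zero at the optimum
disp(norm(J))                             % not small: J(i) = sin(xdata(i))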

More Answers (2)

Joffray Guillory
Joffray Guillory on 5 Aug 2013
I use the lsqnonlin function (not lsqcurvefit). So I have something like this (a very simple example with only one fit parameter):
xdata = [ ... ];
ydata = [ ... ];
g = @(x) x*sin(xdata) - ydata ;
[x,fval,residual,exitflag,output,lambda_fit,jacobian_fit] = lsqnonlin(g,x0,[],[],options);
According to you, my objective function is (I agree):
sum(( x*sin(xdata) - ydata ).^2)
but my Jacobian is:
J(i) = partial( x*sin(xdata) - ydata )/partial(x)
and not:
J(i) = partial( x*sin(xdata) )/partial(x)
because the input argument to my lsqnonlin function is x*sin(xdata) - ydata. In other words, the Jacobian returned by lsqnonlin is the derivative of the residual (calculated at the optimum x). Is that right?
1 comment
Alan Weiss
Alan Weiss on 5 Aug 2013
It is immaterial whether or not we subtract ydata. For your example,
J(i) = sin(xdata(i))
whether or not ydata is included. In this example, J is a vector with length(xdata) components.
I hope this clarifies the computation.
Alan Weiss
MATLAB mathematical toolbox documentation
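To see this concretely, here is a minimal sketch with hypothetical data (Optimization Toolbox assumed) that compares the Jacobian returned with and without the constant ydata term:

xdata = linspace(0, pi, 10);
ydata = 2*sin(xdata);
g1 = @(x) x*sin(xdata) - ydata;        % residual including ydata
g2 = @(x) x*sin(xdata);                % same model with ydata dropped
[~, ~, ~, ~, ~, ~, J1] = lsqnonlin(g1, 1);
[~, ~, ~, ~, ~, ~, J2] = lsqnonlin(g2, 1);
disp(norm(full(J1) - full(J2)))        % ~0: subtracting ydata does not change J
disp(norm(full(J1) - sin(xdata(:))))   % ~0: J(i) = sin(xdata(i))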



Joffray Guillory
Joffray Guillory on 6 Aug 2013
Thanks for these answers. Now it is clear to me.
Joffray
