Confidence intervals for fitting parameters in a system of ODEs?
Hayat Adawi
on 13 Mar 2021
Answered: Bjorn Gustavsson
on 14 Mar 2021
tl;dr: I have successfully fit a set of experimental data to a system of complex nonlinear ODEs, to yield me several fitted parameters, and now I am looking for a way to define confidence intervals for those parameters. Any suggestions?
Details/Background: I have a set of data of time (independent variable) and 3 concentrations (dependent variables) to model the simultaneous formation of 3 chemicals in a reaction. The temporal concentrations of the three chemicals ("C1, C2, C3") are modeled with a set of 3 ODEs ("dC1/dt, dC2/dt, dC3/dt") that I consolidated into a single function. I have already fit the experimental data to the system of ODEs by passing that function to lsqcurvefit, so I now have a set of fitted parameters. I'm struggling to find a way to extract confidence intervals, as "fit" and "fittype", which do give confidence intervals, do not seem to work with systems of ODEs.
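For readers reproducing this workflow outside MATLAB, here is a minimal sketch of the same idea, wrapping an ODE integration inside the function handed to the fitter, written in Python/SciPy. The consecutive A→B→C kinetics, rate constants, initial concentrations, and noise level are all made up for illustration and are not the asker's actual model.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def odes(t, C, k1, k2):
    # Made-up consecutive kinetics for dC1/dt, dC2/dt, dC3/dt.
    C1, C2, C3 = C
    return [-k1*C1, k1*C1 - k2*C2, k2*C2]

def model(p, t_obs):
    # Integrate the ODE system for parameters p; return a 3 x N array
    # of concentrations at the observation times.
    sol = solve_ivp(odes, (0, t_obs[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_obs, args=tuple(p), rtol=1e-8, atol=1e-10)
    return sol.y

# Synthetic "experimental" data from known rate constants plus noise.
t_obs = np.linspace(0, 10, 25)
p_true = [0.8, 0.3]
rng = np.random.default_rng(2)
C_obs = model(p_true, t_obs) + 0.01*rng.standard_normal((3, t_obs.size))

def residuals(p):
    # Flatten all three concentration curves into one residual vector.
    return (model(p, t_obs) - C_obs).ravel()

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(0, np.inf))
print(fit.x)  # recovered rate constants
```

This mirrors the lsqcurvefit approach: the fitter never sees the ODEs directly, only a function mapping parameters to predicted concentrations.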
0 comments
Accepted Answer
Star Strider
on 13 Mar 2021
The approach in Parameter Estimation for a System of Differential Equations uses the lsqcurvefit function, and lsqcurvefit works with nlparci to produce the parameter confidence intervals you want to calculate. (Consult the relevant documentation.)
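The calculation nlparci performs can be illustrated outside MATLAB as well. Here is a hedged Python/NumPy sketch of the same recipe: fit a made-up one-parameter-pair decay model, then form 95% confidence intervals from the residuals and the Jacobian at the optimum (the model, data, and noise level are invented for the example).

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import t as t_dist

# Synthetic data for a made-up decay model C(t) = p0*exp(-p1*t).
rng = np.random.default_rng(0)
t_obs = np.linspace(0, 5, 40)
p_true = np.array([2.0, 0.7])
C_obs = p_true[0]*np.exp(-p_true[1]*t_obs) + 0.02*rng.standard_normal(t_obs.size)

def residuals(p):
    return p[0]*np.exp(-p[1]*t_obs) - C_obs

fit = least_squares(residuals, x0=[1.0, 1.0])

# Confidence intervals from residuals and Jacobian, the same ingredients
# lsqcurvefit hands to nlparci.
dof = t_obs.size - fit.x.size                  # degrees of freedom
s2 = (fit.fun @ fit.fun) / dof                 # residual variance estimate
cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)  # parameter covariance
half = t_dist.ppf(0.975, dof) * np.sqrt(np.diag(cov))
ci = np.column_stack([fit.x - half, fit.x + half])  # 95% intervals
print(ci)
```

In MATLAB the equivalent is to request the residual and Jacobian outputs from lsqcurvefit and pass them to nlparci.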
I do not have the Curve Fitting Toolbox (I do not need it for the sorts of parameter estimation I do, since the Statistics and Machine Learning, Optimization, and Global Optimization Toolboxes do what I want), so I cannot help you with it.
0 comments
More Answers (1)
Bjorn Gustavsson
on 14 Mar 2021
Your model function seems to produce the residuals, r(p) = C_model(p) - C_obs, between modeled and observed concentrations. At the optimal values you get the parameter uncertainties, Sigma_par, for the optimisable parameters p, from:
Sigma_par = (J.'*inv(Sigma_obs)*J)^-1
where Sigma_obs is the covariance matrix of the "observables" and J is the Jacobian of the observables with respect to the parameters, J(i,j) = dr_i/dp_j
...if you can get away with linearizing the model around the optimal point.
What I do for some of my problems (weighted least squares, with normally distributed uncorrelated observables, so Sigma_obs is a diagonal matrix with the variances on the diagonal) is something along these lines:
par_best = [1, 123, 12]; % some arbitrary values
dpar = [0.1, 10, 1]; % suitable steps away from the optimal point
Sigma_obs = diag(12*abs(randn(123,1))); % an arbitrary measurement covariance matrix
res_opt = model_fun_res(par_best); % residuals at the optimal parameters
J = zeros(numel(res_opt), numel(par_best)); % preallocate the numerical Jacobian
for iPar = 1:numel(par_best)
  curr_par = par_best;
  curr_par(iPar) = curr_par(iPar) + dpar(iPar); % step away in one parameter
  res_curr = model_fun_res(curr_par); % perturbed residuals
  J(:,iPar) = (res_curr - res_opt)/dpar(iPar); % column of the numerical Jacobian
end
SigmaPar = inv(J.'*inv(Sigma_obs)*J); % estimate of the parameter covariance
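The same linearized-covariance recipe can be checked numerically. Below is a Python/NumPy transcription with a toy straight-line residual function standing in for model_fun_res (the line model, noise level, and step sizes are made up so the sketch runs standalone); for a linear model the finite-difference Jacobian is exact, so the result can be compared against the textbook least-squares covariance.

```python
import numpy as np

# Toy "observations": a straight line plus noise with known sigma = 0.1.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 123)
y_obs = 2.0 + 3.0*t + 0.1*rng.standard_normal(t.size)

def model_fun_res(par):
    # Stand-in residual function: straight-line model minus observations.
    return par[0] + par[1]*t - y_obs

par_best = np.array([2.0, 3.0])               # pretend fitted values
dpar = np.array([1e-6, 1e-6])                 # finite-difference steps
Sigma_obs = np.diag(np.full(t.size, 0.1**2))  # diagonal observation covariance

res_opt = model_fun_res(par_best)
J = np.empty((res_opt.size, par_best.size))
for i in range(par_best.size):
    cur = par_best.copy()
    cur[i] += dpar[i]
    J[:, i] = (model_fun_res(cur) - res_opt) / dpar[i]  # forward difference

# Parameter covariance: Sigma_par = (J' * inv(Sigma_obs) * J)^-1
Sigma_par = np.linalg.inv(J.T @ np.linalg.inv(Sigma_obs) @ J)
sigma_par = np.sqrt(np.diag(Sigma_par))  # 1-sigma parameter uncertainties
print(sigma_par)
```

The square roots of the diagonal of Sigma_par are the 1-sigma parameter uncertainties; multiply by the appropriate t- or z-quantile for a confidence interval.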
As I understand it, your problem should be similar enough to this.
HTH
0 comments