Fit nonlinear regression model

Hidd_1 on 19 May 2023
I am trying to fit the enclosed data.
For the X-axis:
x = 0:1:248;
I enclosed the data for the Y-axis (each row is a curve; there are 5 rows = 5 curves), and I want to fit it with the objective function defined below.
Here is my code:
DF = load('Inter_cubic.mat');
Data = DF.Inter_cubic(:,1:230);
Data1 = array2table(Data);
x = 1:size(Data,2);
Sig = @(p,x) p(4)./(1 + exp(-p(1).*x + p(7))) + p(5)./(1 + exp(-p(2).*x + p(8))) + p(6)./(1 + exp(-p(3).*x + p(9))) + p(10);
beta0 = [Data(1,1) Data(2,1) Data(3,1) Data(4,1) Data(5,1)];
for k = 1:size(Data1,1)
    mdl(1,:) = fitnlm(Data1(k,:),Sig,beta0(1,k));
end
I am getting an error message about using the function "fitnlm". Also, regarding the initial values beta0: are they the initial values of the data?

Accepted Answer

the cyclist on 19 May 2023
Answering your main question: beta0 is the initial guess at the coefficients of the fit. In your case, MATLAB is expecting a vector of length 10, because you have 10 parameters to fit.
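For illustration only, here is a minimal sketch (separate from the actual fit below, using made-up x values) of how a length-10 beta0 lines up with the p(1)..p(10) parameters of your anonymous function:
% beta0(i) is the starting value that fitnlm uses for parameter p(i)
Sig = @(p,x) p(4)./(1 + exp(-p(1).*x + p(7))) + p(5)./(1 + exp(-p(2).*x + p(8))) + p(6)./(1 + exp(-p(3).*x + p(9))) + p(10);
beta0 = ones(1,10);            % one starting value per parameter p(1)..p(10)
yAtGuess = Sig(beta0,(1:10)'); % model evaluated at the initial guess (illustrative x values only)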
There are lots of problems with both your MATLAB syntax and how you are trying to fit your curves. In the code below, I have fixed the syntax problems by
  • fixing the beta0 problem you asked about
  • creating separate tables for each fit
  • storing the resulting models in a cell array
But I did not try to fix the equation you are trying to fit.
load Data
x = (1:size(Data,2))';
% Define the fitting function
Sig = @(p,x) p(4)./(1 + exp(-p(1).*x + p(7))) + p(5)./(1 + exp(-p(2).*x + p(8))) + p(6)./(1 + exp(-p(3).*x + p(9))) + p(10);
% Initial guess of coefficients
beta0 = ones(1,10);
for k = 1:size(Data,1)
    % Put the data for this curve into a table
    y = Data(k,:)';
    tbl = table(x,y);
    % Fit the model
    mdl{k} = fitnlm(tbl,Sig,beta0);
    % Plot the fit against the data
    figure
    hold on
    plot(x,Data(k,:),'o')
    plot(x,predict(mdl{k},x))
end
Warning: Rank deficient, rank = 4, tol = 8.082340e-13.
Warning: Rank deficient, rank = 4, tol = 8.081976e-13.
Warning: Some columns of the Jacobian are effectively zero at the solution, indicating that the model is insensitive to some of its parameters. That may be because those parameters are not present in the model, or otherwise do not affect the predicted values. It may also be due to numerical underflow in the model function, which can sometimes be avoided by choosing better initial parameter values, or by rescaling or recentering. Parameter estimates may be unreliable.
(Similar rank-deficiency and Jacobian warnings are printed for each of the five curves.)
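As a side note, one way to see which estimates those warnings are flagging (a minimal sketch, assuming the mdl cell array produced by the loop above) is to inspect the coefficient table and confidence intervals of a fit:
% Estimates, standard errors, t-statistics and p-values of the first fit
mdl{1}.Coefficients
% 95% confidence intervals; very wide intervals point to parameters the data barely constrain
coefCI(mdl{1})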
5 comments
the cyclist on 19 May 2023
I think a simplified model can still give a pretty good fit if you choose better initial guesses. And you may need different initial guesses for each curve. For example, looking only at the 5th set of data ...
load Data
x = (1:size(Data,2))';
% Define the fitting function
% Sig = @(p,x) p(4)./(1 + exp(-p(1).*x + p(7))) + p(5)./(1 + exp(-p(2).*x + p(8))) + p(6)./(1 + exp(-p(3).*x + p(9))) + p(10);
Sig = @(p,x) p(2)./(1 + exp(-p(1).*x + p(3))) + p(4); % <----- I used only one non-linear function here, not all three
% Initial guess of coefficients
beta0 = [0.01 1 0 19] % <------- I changed these
beta0 = 1×4
0.0100 1.0000 0 19.0000
for k = 5 % <------- I am ONLY looking at 5th curve
    % Put the data for this curve into a table
    y = Data(k,:)';
    tbl = table(x,y);
    % Fit the model
    mdl{k} = fitnlm(tbl,Sig,beta0);
    % Plot the fit against the data
    figure
    hold on
    plot(x,Data(k,:),'o')
    plot(x,predict(mdl{k},x))
end
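If that works for the 5th curve, a minimal sketch of extending it to all five curves with per-curve initial guesses might look like the following (the guesses are just the same placeholder row repeated; they would need tuning for each curve):
% One row of starting values per curve (placeholder values, not tuned)
beta0All = repmat([0.01 1 0 19],size(Data,1),1);
mdl = cell(size(Data,1),1);
for k = 1:size(Data,1)
    y = Data(k,:)';
    tbl = table(x,y);
    mdl{k} = fitnlm(tbl,Sig,beta0All(k,:));
end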
Hidd_1 on 19 May 2023
Thanks a lot!


More Answers (0)

Version

R2020a
