Numerical Approximation of Fisher Information Matrix

18 views (last 30 days)
Edoardo Briganti on 25 Oct 2020
Commented: iust on 15 Nov 2020
Dear all,
I need to calculate the asymptotic standard errors of my maximum likelihood estimates. In order to do this, I have to calculate the Fisher Information Matrix. Analytical derivation of it is possible, but it is a tedious exercise in matrix algebra and matrix differentiation that I don't want to redo whenever I tweak my model and, in turn, my likelihood.
Is there a way to numerically approximate the Fisher Information matrix evaluated at my maximum likelihood estimates?
% Define Concentrated Log-Likelihood:
Log_L = @(r) log_likelihood(r,T,n,dy,d1,d2,W1,W2,X,6); % r is a vector of 2 parameters: r(1) and r(2).
% Maximum Likelihood Estimation (Constrained bivariate optimization)
options = optimoptions('fmincon','display','none');
x0 = .1 + .9*rand(1,2);
lb = [lb1,lb2];
ub = [ub1,ub2];
[sol,~,~,~,~,~,H] = fmincon(Log_L,x0,[],[],[],[],lb,ub,[],options);
% Maximum Likelihood Estimates of parameters which have been concentrated with respect to r(1) and r(2)
[~,hat_beta,hat_omega] = log_likelihood(sol,T,n,dy,d1,d2,W1,W2,X,k);
function [Log_L,beta_wls,omega] = log_likelihood(r,T,n,dy,d1,d2,W1,W2,X,k)
    D1 = kron(d1,ones(n,1));
    D2 = kron(d2,ones(n,1));
    t1 = sum(d1);
    t2 = sum(d2);
    Z = dy - r(1)*D1.*(kron(eye(T),W1)*dy) - r(2)*D2.*(kron(eye(T),W2)*dy);
    beta_ols = X\Z;
    eps_ols = Z - X*beta_ols;
    omega = diag( 1/(T-k).*sum(reshape(eps_ols,[n,T]).*reshape(eps_ols,[n,T]),2) ); % n parameters to estimate
    sigma = kron(eye(T),omega);
    beta_wls = ( X'*(sigma\X) )\( X'*(sigma\Z) );
    eps_wls = Z - X*beta_wls;
    H1 = inv(eye(n) - r(1)*W1);
    H2 = inv(eye(n) - r(2)*W2);
    % Concentrate (negative) log-likelihood with respect to r(1) and r(2) (2 parameters to estimate)
    Log_L = -( - T/2*log(det(omega)) - t1*log(det(H1)) ...
               - t2*log(det(H2)) - 0.5*eps_wls'*(sigma\eps_wls) );
end
You can find my code above. I concentrate the likelihood with respect to 2 parameters; the remaining parameters are contained in the vector beta_wls (n+6 parameters in total) and in the diagonal matrix omega (n variances).
Thanks in advance,
Best,
Edoardo
1 comment
iust on 15 Nov 2020
Hi,
I have the same problem, and I don't fully understand what your code does. My system of equations is dynamic and I use a Kalman filter for estimation, so the Fisher information matrix gets a little messy. Can you help me find the Fisher information for a dynamic system?
I just added one question, sorry.
Thanks,
Adel


Accepted Answer

John D'Errico on 25 Oct 2020
Edited: John D'Errico on 25 Oct 2020
Can you not perform numerical differentiation? At the very least, you can use the numerical differentiation tools from my derivest set of tools on the File Exchange; there is a Hessian matrix tool in there.
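For example, a minimal sketch of how the Hessian tool from that package might be applied to the code in the question (this assumes the derivest files, including hessian.m, are on the MATLAB path, and that sol lies strictly inside the bounds):
% Sketch only: Log_L and sol come from the code in the question.
% Log_L is already the NEGATIVE log-likelihood, so its numerical Hessian at the
% optimum plays the role of an observed information matrix for r(1) and r(2).
[I_obs,err] = hessian(Log_L,sol);   % finite-difference Hessian plus an error estimate
Cov_r = inv(I_obs);                 % approximate asymptotic covariance of [r(1),r(2)]
se_r  = sqrt(diag(Cov_r));          % asymptotic standard errors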
1 comment
Edoardo Briganti on 25 Oct 2020
Edited: Edoardo Briganti on 25 Oct 2020
Hi John,
thanks for your prompt reply. I could use your package to calculate the Hessian of my log-likelihood at the maximum likelihood estimates. However, the Fisher Information Matrix requires taking the expectation of this object: I(θ) = -E[ ∂²log L(θ) / ∂θ ∂θ' ].
Wouldn't that be a problem? Would the numerical approximation of the Hessian be enough to approximate the Fisher Information Matrix?
Best,
Edoardo
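As a side note, a common plug-in argument (under standard regularity conditions, and assuming the model is correctly specified) is that the observed information, i.e. the Hessian of the negative log-likelihood evaluated at the MLE, is a consistent estimator of the expected Fisher information, so in practice the numerical Hessian is what gets inverted for standard errors. A rough sanity check, assuming Log_L, sol and H from the question are still in the workspace and that hessian() from the derivest package is available, might look like this, keeping in mind that fmincon's Hessian output is only an approximation and MathWorks cautions it may be inaccurate:
% Sketch only: compares two estimates of the information matrix for r at the MLE.
I_fd  = hessian(Log_L,sol);          % finite-difference observed information
I_fmc = H;                           % Hessian approximation returned by fmincon
se_fd  = sqrt(diag(inv(I_fd)));      % standard errors from the numerical Hessian
se_fmc = sqrt(diag(inv(I_fmc)));     % standard errors from fmincon's approximation
disp([se_fd, se_fmc])                % the two columns should be broadly similar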


More Answers (0)
