P-value for multivariate regression

2 views (last 30 days)
Roni on 27 May 2015
Commented: Roni on 5 Jul 2016
I am interested in using mvregress for multivariate regression (for example, say I have responses [y1, y2, y3] and a predictor x). I was surprised to see that, unlike the regress function, mvregress does not provide statistics such as R-squared or p-values. I therefore have two questions:
1. Is there any reason not to calculate these statistics? It seems like an inconsistency between the two functions, so I was wondering whether there is a good reason not to try calculating the p-values myself.
2. While looking for an answer, I found this: http://www.mathworks.com/matlabcentral/answers/108929-after-using-mvregress-how-can-i-find-the-rsquared-value-t-values-p-values-f-statistic-and-stand. As I understand it, that code works for the univariate case, in which there is a single Y (hence beta(1) can be used). I thought of using beta(1,:) instead, but I wasn't sure how to use CovB. I am trying to write such code for the case of multiple y's with different intercepts and slopes, based on Hotelling's T-squared distribution, but every time I manage to solve one problem along the way, a different one pops up (see the sketch right below this list).
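For concreteness, here is the kind of per-coefficient (Wald-type) calculation I have in mind, using beta and CovB from mvregress. The data are simulated and the normal approximation is just my assumption; these are marginal tests for each coefficient, not the joint Hotelling-type test I am ultimately after.

% Sketch: marginal Wald-type p-values from mvregress output (simulated data).
n = 100; d = 3;                             % n observations, d = 3 responses
x = randn(n,1);                             % one explanatory variable
Y = repmat([1 2 3],n,1) + x*[0.5 -0.2 1.0] + 0.3*randn(n,d);   % [y1 y2 y3]

% Separate intercept and slope for each response via a cell-array design
X = cell(n,1);
for i = 1:n
    X{i} = [eye(d), x(i)*eye(d)];           % d-by-2d design for observation i
end

[beta,Sigma,E,CovB,logL] = mvregress(X,Y);

se   = sqrt(diag(CovB));                    % standard errors from diag(CovB)
tval = beta ./ se;                          % one Wald statistic per coefficient
pval = 2*normcdf(-abs(tval));               % two-sided p-values (normal approx.)
disp(table(beta, se, tval, pval))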
I would very much appreciate any help here.
Thanks!
  2 comments
User110 on 21 Jun 2016
Hi Roni. I've encountered exactly the same problem. I also have multiple response variables [y1 y2 y3], and I'm stuck at the same point you are. I also figured that beta(1,:) would be an appropriate adjustment to the code, but I can't for the life of me figure out how to combine CovB with beta to get the t-ratio (and from it the p-value).
Have you found a solution?
Roni on 5 Jul 2016
Well, it wasn't easy, but after a lot of digging I found some help in "Methods of Multivariate Analysis" by Alvin C. Rencher, pages 373-375. There are some approximations that can be used in specific cases. I was only interested in one explanatory variable, so it worked for me. I really recommend this book; it truly helped me understand the subject better.
I eventually chose to work with likelihood ratio tests. As far as I understood, the degrees of freedom should be (num_variates)*(rank_explanatory), where num_variates is 3 for the case of [y1 y2 y3] and rank_explanatory is the rank of the explanatory variables that differ between the full and the reduced model used in the LR test (see the sketch below).
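As a rough illustration of what I mean (simulated data; the chi-squared reference is the usual asymptotic approximation, so treat it only as a sketch):

% Sketch: likelihood-ratio test with one explanatory variable and d = 3 responses.
n = 100; d = 3;
x = randn(n,1);
Y = repmat([1 2 3],n,1) + x*[0.5 -0.2 1.0] + 0.3*randn(n,d);

Xfull = cell(n,1);  Xred = cell(n,1);
for i = 1:n
    Xfull{i} = [eye(d), x(i)*eye(d)];   % full model: separate intercepts and slopes
    Xred{i}  = eye(d);                  % reduced model: intercepts only
end

[~,~,~,~,logLfull] = mvregress(Xfull, Y);   % fifth output is the log likelihood
[~,~,~,~,logLred]  = mvregress(Xred,  Y);

LR = 2*(logLfull - logLred);            % likelihood-ratio statistic
df = d*1;                               % num_variates * rank_explanatory
p  = 1 - chi2cdf(LR, df);               % asymptotic chi-squared p-value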
I hope this helps, good luck!

Answers (0)
