How to calculate the coefficient of determination R^2 of a Neural Network?

I want to calculate the coefficient of determination R^2 of a Neural Network by myself.
This is the regression plot that the Neural Network Training Tool produces, but I want to calculate it myself so I can "confirm" what I see in the NN Training Tool.
As you can see below, I have plotted the Target (X) against the Prediction (Y) as Y = A*X, but the Regression Plot is quite different: Prediction (Y) = 0.99*Target + 0.0044, i.e. Y = A*X + B.
I understand that the weights and biases are A and B respectively, but how can I find them and do this myself, since there are weights on the Input Layer and on the Hidden Layer as well?
Also, how can I draw the line that runs through the middle of my data points? (One way of doing this is sketched after the plotting code below.)
figure;
plot(Output_Train,pFNN40_Train,'x');
title('Coefficient of Determination R^2');
legend('Train');
xlabel('Target');
ylabel('Prediction');
axis auto;
grid on;
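For reference, here is a minimal sketch of how such a line could be drawn over the scatter with polyfit/polyval. It assumes the same Output_Train and pFNN40_Train variables as in the code above; the fitted slope and intercept play the role of A and B in the Regression Plot.
figure;
plot(Output_Train,pFNN40_Train,'x');
hold on;
% Least-squares fit of Prediction vs Target: p(1) is the slope A, p(2) the intercept B
p = polyfit(Output_Train(:),pFNN40_Train(:),1);
xfit = linspace(min(Output_Train(:)),max(Output_Train(:)),100);
plot(xfit,polyval(p,xfit),'-');    % fitted line Y = A*X + B
plot(xfit,xfit,'--');              % reference line Y = X (perfect prediction)
legend('Train',sprintf('Fit: Y = %.2f*X + %.4f',p(1),p(2)),'Y = X');
xlabel('Target');
ylabel('Prediction');
grid on;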

Accepted Answer

the cyclist on 4 Dec 2022
The formula is in the documentation here (for fitlm). It is not always the best goodness-of-fit measure for every model, but you should always be able to calculate it like this:
% Some pretend predicted and actual Y values
y_actual = [2.1; 3.2; 5.3; 7.1; 11.9];
y_predicted = [2.9; 2.7; 5.0; 7.2; 11.1];
% Plot them
plot(y_actual,y_predicted,'o')
% Sum of squared residuals
SSR = sum((y_predicted - y_actual).^2);
% Total sum of squares
TSS = sum(((y_actual - mean(y_actual)).^2));
% R squared
Rsquared = 1 - SSR/TSS
Rsquared = 0.9726
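Applied to the variables in the question, the same calculation would look like this (a sketch, assuming Output_Train holds the targets and pFNN40_Train the network predictions, as in the plotting code above):
% R^2 of the training predictions, using the question's variable names
SSR_train = sum((pFNN40_Train(:) - Output_Train(:)).^2);        % sum of squared residuals
TSS_train = sum((Output_Train(:) - mean(Output_Train(:))).^2);  % total sum of squares
Rsquared_train = 1 - SSR_train/TSS_train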

More Answers (1)

Joan M. Maura on 14 Feb 2023
Does the R in the regression plots for validation, test, training, and all mean R^2, or do I have to square those results?
1 comment
the cyclist on 14 Feb 2023 (edited 14 Feb 2023)
In general, and certainly in any output from MathWorks, I would definitely not expect someone to write R when they mean R^2.
Also, you need to be cautious: the coefficient of determination (even though it is often called R^2) is not always equal to the correlation coefficient (R) squared. You can even build models in which R^2 is negative.
If you need R^2, you need to calculate it, not just square R.
Here is a contrived example, to illustrate the point:
x = [1.0; 2.0; 3.0; 4.0; 5.0; 6.0];
y_actual = [2.5; 4.1; 7.0; 7.7; 10.7; 12.2];
y_predicted = [2; 3; 4; 5; 6; 7]; % I've chosen these terrible predictions on purpose, to illustrate my point
figure
hold on
plot(x,y_actual,".","MarkerSize",24)
plot(x,y_predicted,".-","MarkerSize",8)
legend(["data","prediction"],"Location","NorthWest")
correlation = corr(x,y_predicted)
correlation = 1
SSR = sum((y_predicted - y_actual).^2);
TSS = sum(((y_actual - mean(y_actual)).^2));
Rsquared = 1 - SSR/TSS
Rsquared = 0.0318
Notice that the correlation is perfect, while the R^2 is terrible. If I had made the predictions smaller (moving them all down by, say, 2 units), the R^2 would have gone negative.
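A quick check of that last claim, shifting the same (hypothetical) predictions down by 2 units:
y_predicted_shifted = y_predicted - 2;                   % move every prediction down by 2 units
SSR_shifted = sum((y_predicted_shifted - y_actual).^2);  % residuals grow, so SSR grows
Rsquared_shifted = 1 - SSR_shifted/TSS                   % now negative (roughly -1.3)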
