
Importance of predictors within an Optimizable GPR model

7 views (last 30 days)
lauzof on 6 Sep 2023
Commented: lauzof on 20 Sep 2023
Hello everyone,
Related to a previous topic I posted (https://it.mathworks.com/matlabcentral/answers/2011417-regression-learner-app-relative-weights-of-variables#answer_1296851?s_tid=prof_contriblnk), I'd like to know whether it is possible to find the predictor importance of my model at a global level, i.e. something like the first plot shown here https://it.mathworks.com/help/stats/lime.plot.html, but for the whole model rather than for an individual queryPoint.
thanks again!
best,
Laura

Accepted Answer

Ive J on 11 Sep 2023
It's almost always good to look at both local and global levels. In the case of LIME or SHAP, you can compute values on a random subset of your training dataset and take the mean absolute SHAP value per predictor (this can be quite computationally heavy for GPR, though).
numSample = 1000; % use 1000 randomly selected observations
shap_idx = randsample(1:height(x_train), numSample);
shap_train = x_train(shap_idx, 1:end-1); % x_train is a table; the target variable is the last column
explainer = shapley(yourGPRmodel, shap_train, UseParallel=true); % check other arguments, e.g. if categorical features are present
shap = zeros(size(shap_train, 1), size(shap_train, 2));
for k = 1:size(shap_train, 1)
    explainer = fit(explainer, shap_train(k, :), UseParallel=true);
    shap(k, :) = explainer.ShapleyValues.ShapleyValue;
end
shap_mean_abs = mean(abs(shap), 1)'; % mean absolute Shapley value per predictor
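To get a global importance view similar to the bar chart produced by lime.plot, you could plot the mean absolute Shapley values per predictor. A minimal sketch, assuming shap_train and shap_mean_abs from the code above (not part of the original answer):
% Minimal sketch: bar chart of mean |SHAP| per predictor (global importance)
predictorNames = shap_train.Properties.VariableNames;
bar(shap_mean_abs);
xticks(1:numel(predictorNames));
xticklabels(predictorNames);
ylabel('Mean |Shapley value|');
title('Global predictor importance');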
At a global level you can also try partial dependence plots:
doc plotPartialDependence
Also check ICE plots (available through the same function) for individual observations; see the sketch below.
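For example, a minimal sketch assuming yourGPRmodel and the predictor table shap_train from above; 'Var1' is a placeholder predictor name, not from the original answer:
% Partial dependence of the response on one predictor (global view).
% 'Var1' is a placeholder; replace it with one of your predictor names.
plotPartialDependence(yourGPRmodel, 'Var1', shap_train);

% ICE curves for the same predictor (one line per observation).
plotPartialDependence(yourGPRmodel, 'Var1', shap_train, Conditional='absolute');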

More Answers (0)

