fitcsvm cross-validation

João Mendes on 15 Apr 2021
Commented: João Mendes on 16 Apr 2021
Hi, I am training an SVM classifier with the following code:
SVM_1 = fitcsvm(X_train, y_train, ...
    'OptimizeHyperparameters', 'all', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'bayesopt', ...
        'AcquisitionFunctionName', 'expected-improvement-per-second-plus', ...
        'Kfold', 10, ...
        'ShowPlots', 0));
Since I specified 10-fold cross-validation, is there any way to retrieve a performance metric of the classifier from the cross-validation (AUC, for example)?
Thank you,
J

Accepted Answer

Alan Weiss
Alan Weiss on 16 Apr 2021
As shown in this doc example, the cross-validation loss is reported at the command line and plotted by default (I see that you turned off the plot). Is there something else that you need, or did I misunderstand you?
Alan Weiss
MATLAB mathematical toolbox documentation
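For reference, a minimal sketch of how to pull those numbers programmatically rather than reading them off the command-line display. It assumes SVM_1 was trained with 'OptimizeHyperparameters' as in the question, in which case the returned model carries the optimization trace in its HyperparameterOptimizationResults property (a BayesianOptimization object):

```matlab
% Assumes SVM_1 was trained with 'OptimizeHyperparameters' as above.
results = SVM_1.HyperparameterOptimizationResults;

results.MinObjective      % best observed cross-validation loss
results.XAtMinObjective   % hyperparameter values at that minimum
results.ObjectiveTrace    % per-iteration objective (the "Objective" column)
bestPoint(results)        % best point according to the estimated model
```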
3 Comments
Alan Weiss
Alan Weiss on 16 Apr 2021
The "Objective" in the iterative display (the generated table of iterations) is the cross-validation loss. "Best so far" is simply the minimum objective up to that iteration. Note the difference between the estimated and the observed "best so far": the estimate comes from the surrogate model that the solver maintains, and that model changes every iteration. The model assumes the observations themselves are noisy, so observing a value once doesn't mean that observing it again will give the same response.
In a nutshell, I think that the iterative display gives you the information you seek.
Alan Weiss
MATLAB mathematical toolbox documentation
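Since the optimization objective is the misclassification loss rather than AUC, one way to get a cross-validated AUC for the tuned model is to re-cross-validate it and feed the out-of-fold scores to perfcurve. A hedged sketch, assuming y_train is a binary response and that the second class in SVM_1.ClassNames is your positive class (adjust as needed):

```matlab
% Re-cross-validate the tuned model and compute AUC from out-of-fold scores.
CVSVM = crossval(SVM_1, 'KFold', 10);   % 10-fold cross-validated model
[~, scores] = kfoldPredict(CVSVM);      % out-of-fold classification scores

posClass = SVM_1.ClassNames(2);         % assumed positive class; adjust
[fpr, tpr, ~, AUC] = perfcurve(y_train, scores(:, 2), posClass);
```

kfoldLoss(CVSVM) would likewise give the cross-validated misclassification rate directly.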
João Mendes
João Mendes on 16 Apr 2021
Thank you very much.


More Answers (0)
