SVM training with hyperparameter "CVPartition"

5 views (last 30 days)
hobin Hwang
hobin Hwang on 16 Jun 2022
Commented: hobin Hwang on 19 Jun 2022
I recently came across a variety of hyperparameters while training an SVM. Among them, I am curious about the relationship between the "CVPartition" and "OptimizeHyperparameters" parameters.
  1. Is 'CVPartition' applied only when 'OptimizeHyperparameters' is set to 'none'?
  2. When 'OptimizeHyperparameters' is set to 'auto', the code runs without error. So does cross-validation not take place in that case?
  3. Even when I set up and run 10-fold cross-validation, only a single ClassificationSVM model is returned as output. Is this one model fitted with the best-performing parameters among the ten folds?
Here's my code; I'd appreciate it if you could point out the problem.
classificationSVM = fitcsvm( ...
    Md_train_data, ...                     % training set
    Label, ...                             % training set labels
    'KernelFunction', 'gaussian', ...
    'PolynomialOrder', [], ...
    'KernelScale', 'auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'none', ... % here is the point
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', ...
        'NumGridDivisions', 10, ...
        'AcquisitionFunctionName', 'expected-improvement-plus', ...
        'ShowPlots', false, ...
        'CVPartition', cvpartition(Label, 'KFold', 10))); % and here is the point

Accepted Answer

Alan Weiss
Alan Weiss on 16 Jun 2022
Edited: Alan Weiss on 16 Jun 2022
I think that you have a misunderstanding about what these options do. Look at the first paragraph of the documentation of HyperparameterOptimizationOptions for fitcsvm: "This argument modifies the effect of the OptimizeHyperparameters name-value argument." This means, among other things, that if you set 'OptimizeHyperparameters','none' as you do here, then it doesn't matter what you set for HyperparameterOptimizationOptions, because there will be no optimization.
I am not sure that I understand your second question. If you set 'OptimizeHyperparameters','auto' then the optimized hyperparameters are {'BoxConstraint','KernelScale'}, as documented. As the documentation states, the thing being optimized is the cross-validation loss: "The optimization attempts to minimize the cross-validation loss (error) for fitcsvm by varying the parameters." So there is indeed cross-validation in this case.
For your third question, I think you have a misunderstanding about what is reported back. After the optimization is completed and the optimal hyperparameters are found, fitcsvm uses those optimal hyperparameters and fits a classifier. You do not get the results as a cross-validation (ClassificationPartitionedModel) object, but as a classification object. OK?
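To illustrate the points above, here is a minimal sketch of how the two options are meant to work together, reusing the asker's variable names (Md_train_data, Label). This is an assumption-laden example, not the asker's original intent: BoxConstraint and KernelScale are removed from the call because they are the parameters being optimized, and AcquisitionFunctionName is dropped since grid search does not use it.

```matlab
% Sketch: let fitcsvm optimize hyperparameters, supplying our own
% 10-fold partition for the cross-validation loss it minimizes.
cvp = cvpartition(Label, 'KFold', 10);
classificationSVM = fitcsvm( ...
    Md_train_data, Label, ...
    'KernelFunction', 'gaussian', ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'auto', ...  % optimizes BoxConstraint and KernelScale
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', ...
        'NumGridDivisions', 10, ...
        'ShowPlots', false, ...
        'CVPartition', cvp));  % partition used to evaluate each candidate
% The returned object is a single ClassificationSVM fitted on the full
% training set with the best hyperparameters found, not a
% ClassificationPartitionedModel.
```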
Alan Weiss
MATLAB mathematical toolbox documentation
  3 comments
Walter Roberson
Walter Roberson on 17 Jun 2022
Well you could do that, but you would have to do it "by hand" by calling the function a number of times with different parameters. 'none' on the optimization option means that cross-validation will not be done by the function.
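Doing it "by hand" as described could look like the following sketch (my illustration, assuming the asker's Md_train_data and Label variables): fit once with fixed hyperparameters, then cross-validate the fitted model explicitly.

```matlab
% Sketch: manual cross-validation of a model trained with fixed
% hyperparameters ('OptimizeHyperparameters' left at its default 'none').
mdl = fitcsvm(Md_train_data, Label, ...
    'KernelFunction', 'gaussian', ...
    'KernelScale', 'auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [1; 2]);

cvp   = cvpartition(Label, 'KFold', 10);
cvmdl = crossval(mdl, 'CVPartition', cvp);  % ClassificationPartitionedModel
err   = kfoldLoss(cvmdl);                   % average 10-fold misclassification rate
```

Here crossval returns the ClassificationPartitionedModel that holds all ten fold models, which is the object the asker expected fitcsvm itself to return.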
hobin Hwang
hobin Hwang on 19 Jun 2022
Thanks for your kind reply. Your advice has been helpful.


More Answers (0)
