SVM training with hyperparameter "CVPartition"

While training an SVM, I came across a variety of hyperparameters. In particular, I am curious about the relationship between the parameters "CVPartition" and "OptimizeHyperparameters".
  1. Is 'CVPartition' applied only when 'OptimizeHyperparameters' is set to 'none'?
  2. When 'OptimizeHyperparameters' is set to 'auto', the code runs, but does cross-validation not take place in that case?
  3. Even when I specify 10-fold cross-validation, the output is a single ClassificationSVM model. Is this model the best-performing one among the ten fold models?
Here is my code; I would appreciate it if you could point out the problem.
classificationSVM = fitcsvm( ...
    Md_train_data, ...                 % training set
    Label, ...                         % training set labels
    'KernelFunction', 'gaussian', ...
    'PolynomialOrder', [], ...
    'KernelScale', 'auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'none', ...   % here is the point in question
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', 'NumGridDivisions', 10, ...
        'AcquisitionFunctionName', 'expected-improvement-plus', 'ShowPlots', false, ...
        'CVPartition', cvpartition(Label, 'KFold', 10)));  % and here is the other point

Accepted Answer

Alan Weiss
Alan Weiss on 16 Jun 2022
Edited: Alan Weiss on 16 Jun 2022
I think that you have a misunderstanding about what these options do. Look in the first paragraph of the documentation of HyperparameterOptimizationOptions for fitcsvm: "This argument modifies the effect of the OptimizeHyperparameters name-value argument." This means, among other things, if you set 'OptimizeHyperparameters','none' as you suggest, then it doesn't matter what you set for HyperparameterOptimizationOptions because there will be no optimization.
I am not sure that I understand your second question. If you set 'OptimizeHyperparameters','auto' then the optimized hyperparameters are {'BoxConstraint','KernelScale'}, as documented. As the documentation states, the thing being optimized is the cross-validation loss: "The optimization attempts to minimize the cross-validation loss (error) for fitcsvm by varying the parameters." So there is indeed cross-validation in this case.
For your third question, I think you have a misunderstanding about what is reported back. After the optimization completes and the optimal hyperparameters are found, fitcsvm refits a classifier on the full training set using those optimal hyperparameters. You do not get the results as a cross-validation (ClassificationPartitionedModel) object, but as a single classification object. OK?
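As a minimal sketch of how the two options are meant to work together (reusing the variable names Md_train_data and Label from the question; this is based on the documented behavior, not code from the thread):

```matlab
% Supply CVPartition through HyperparameterOptimizationOptions; it only
% takes effect because OptimizeHyperparameters is NOT 'none'.
cvp = cvpartition(Label, 'KFold', 10);      % 10-fold partition used during optimization

mdl = fitcsvm(Md_train_data, Label, ...
    'KernelFunction', 'gaussian', ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'auto', ...  % optimizes BoxConstraint and KernelScale
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', 'NumGridDivisions', 10, ...
        'ShowPlots', false, ...
        'CVPartition', cvp));               % cross-validation loss is computed on cvp

% mdl is a single ClassificationSVM, refit on all of the training data
% with the best hyperparameters found during the search.
```

Note that 'AcquisitionFunctionName' has been dropped here: it only applies to the 'bayesopt' optimizer, not to 'gridsearch'.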
Alan Weiss
MATLAB mathematical toolbox documentation
  3 Comments
hobin Hwang
hobin Hwang on 19 Jun 2022
Thanks for your kind reply. Your advice has been helpful.


More Answers (0)
