hyperparameterOptimizationOptions
Description
The hyperparameterOptimizationOptions function creates a HyperparameterOptimizationOptions object, which contains options for hyperparameter optimization of machine learning models. You can also use the object to configure multiple hyperparameter optimization problems that have the same option settings but different constraint bounds.

Call the hyperparameterOptimizationOptions function directly to specify hyperparameter optimization settings, such as the constraint type and constraint target. After you create a HyperparameterOptimizationOptions object, you can pass it to a fitting function that supports hyperparameter optimization by specifying the HyperparameterOptimizationOptions name-value argument. For a list of supported fitting functions, see AggregateBayesianOptimization.
Creation
Create a HyperparameterOptimizationOptions object hpoOptions by calling the hyperparameterOptimizationOptions function directly.

Description

hpoOptions = hyperparameterOptimizationOptions(Name=Value) creates a HyperparameterOptimizationOptions object and sets its properties using one or more name-value arguments. For example, hyperparameterOptimizationOptions(Optimizer="gridsearch",NumGridDivisions=20) specifies to use grid search optimization with 20 grid values per dimension.
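For instance, a minimal sketch of this workflow (X and Y stand for predictor data and class labels assumed to already be in the workspace, and the choice of fitctree as the fitting function is illustrative):

% Create options that use grid search with 20 grid values per dimension
hpoOptions = hyperparameterOptimizationOptions( ...
    Optimizer="gridsearch",NumGridDivisions=20);

% Pass the options to a fitting function that supports hyperparameter optimization
Mdl = fitctree(X,Y,OptimizeHyperparameters="auto", ...
    HyperparameterOptimizationOptions=hpoOptions);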
Properties
You can set most properties by using name-value argument syntax when you call hyperparameterOptimizationOptions directly. For example, hyperparameterOptimizationOptions(ConstraintBounds=[1000; 5000; 10000],ConstraintType="size") specifies to solve three optimization problems with constraint bounds of 0 to 1000, 0 to 5000, and 0 to 10000 bytes on the compact model size, using default values for the other properties. You cannot set the properties Objective and Constraint. After you create a HyperparameterOptimizationOptions object, you can modify most of its properties using dot notation. For example, hpoOptions.Verbose=0 specifies to suppress the display of command-line output.
Constraint Properties
Constraint — Constraint function
"None" (default) | "kfoldLoss" | "CompactModelSize" | "LearnerForCoderSize"

This property is read-only.

Constraint function, specified as a string. The values of Constraint and Objective depend on the values of the ConstraintTarget and ConstraintType properties.

Value of ConstraintTarget | Value of ConstraintType | Value of Constraint | Value of Objective |
---|---|---|---|
[], "matlab", or "coder" | [] | "None" | "kfoldLoss" |
"matlab" | "size" | "CompactModelSize" | "kfoldLoss" |
"matlab" | "loss" | "kfoldLoss" | "CompactModelSize" |
"coder" | "size" | "LearnerForCoderSize" | "kfoldLoss" |
"coder" | "loss" | "kfoldLoss" | "LearnerForCoderSize" |

Data Types: string
ConstraintBounds — Constraint bounds
[] (default) | N-by-2 numeric matrix | numeric vector of length N

Constraint bounds for N optimization problems, specified as an N-by-2 numeric matrix, a numeric vector of length N, or []. The columns of ConstraintBounds contain the lower and upper bound values of the optimization problems. If you specify ConstraintBounds as a numeric vector, the software converts ConstraintBounds to a matrix, assigning the input values to the second column and zeros to the first column. If you specify ConstraintBounds when you create hpoOptions, you must also specify ConstraintType.

Data Types: single | double
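For instance, a minimal sketch of the vector-to-matrix conversion described above (the commented display assumes the documented behavior of placing zeros in the first column):

% Specify the bounds as a vector of upper bounds for three problems
hpoOptions = hyperparameterOptimizationOptions( ...
    ConstraintType="size",ConstraintBounds=[1000; 5000; 10000]);

% Inspect the stored N-by-2 matrix of lower and upper bounds
hpoOptions.ConstraintBounds
%            0        1000
%            0        5000
%            0       10000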
ConstraintTarget — Constraint target
[] (default) | "matlab" | "coder"

Constraint target, specified as [], "matlab", or "coder". The values of ConstraintTarget and ConstraintType determine the objective and constraint functions for the optimization problems. If you specify ConstraintBounds and ConstraintType when you create hpoOptions, then the default value of ConstraintTarget is "matlab". Otherwise, the default value is []. If both ConstraintBounds and ConstraintType are [] and you set ConstraintTarget, the software sets ConstraintTarget to [].

Data Types: char | string
ConstraintType — Constraint type
[] (default) | "size" | "loss"

Constraint type, specified as [], "size", or "loss". The values of ConstraintType and ConstraintTarget determine the objective and constraint functions for the optimization problems. If you specify ConstraintType when you create hpoOptions, you must also specify ConstraintBounds.

Data Types: char | string
Optimization Properties
Objective — Objective function
"kfoldLoss" | "CompactModelSize" | "LearnerForCoderSize"

This property is read-only.

Objective function, specified as a string. The values of Objective and Constraint depend on the values of the ConstraintTarget and ConstraintType properties.

Value of ConstraintTarget | Value of ConstraintType | Value of Constraint | Value of Objective |
---|---|---|---|
[], "matlab", or "coder" | [] | "None" | "kfoldLoss" |
"matlab" | "size" | "CompactModelSize" | "kfoldLoss" |
"matlab" | "loss" | "kfoldLoss" | "CompactModelSize" |
"coder" | "size" | "LearnerForCoderSize" | "kfoldLoss" |
"coder" | "loss" | "kfoldLoss" | "LearnerForCoderSize" |

Data Types: string
Optimizer — Optimization algorithm
"bayesopt" (default) | "gridsearch" | "randomsearch" | "asha"

Optimization algorithm, specified as one of the values in this table.

Value | Description |
---|---|
"bayesopt" (default) | Use Bayesian optimization. |
"gridsearch" | Use grid search with NumGridDivisions values per dimension. The software searches in a random order, using uniform sampling without replacement from the grid. The fitrauto and fitcauto fitting functions do not support grid search. |
"randomsearch" | Search at random among MaxObjectiveEvaluations points. The fitrauto and fitcauto fitting functions do not support random search. |
"asha" | Use asynchronous successive halving algorithm (ASHA) optimization. Only the fitrauto and fitcauto fitting functions support ASHA optimization. |

For more information on optimization algorithms, see the documentation pages of the individual fitting functions.

Data Types: char | string
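For instance, a minimal sketch of selecting an alternative optimizer (the evaluation budget of 60 is an illustrative choice, not a documented default):

% Use random search and cap the number of objective function evaluations
hpoOptions = hyperparameterOptimizationOptions( ...
    Optimizer="randomsearch",MaxObjectiveEvaluations=60);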
AcquisitionFunctionName — Type of acquisition function
"expected-improvement-per-second-plus" (default) | "expected-improvement" | "expected-improvement-plus" | "expected-improvement-per-second" | "lower-confidence-bound" | "probability-of-improvement"

Type of acquisition function, specified as one of the listed values. When you pass hpoOptions to fitcauto or fitrauto, the software uses the acquisition function "expected-improvement".

Acquisition functions whose names include per-second do not yield reproducible results, because the optimization depends on the run time of the objective function. Acquisition functions whose names include plus modify their behavior when they overexploit an area. For more details, see Acquisition Function Types.

Data Types: char | string
MaxObjectiveEvaluations — Maximum number of objective function evaluations
positive integer | []

Maximum number of objective function evaluations for each optimization problem, specified as a positive integer or []. If MaxObjectiveEvaluations is [], then the default value depends on the fitting function to which you pass hpoOptions. For more information, see the HyperparameterOptimizationOptions name-value argument description on the fitting function documentation page.

Data Types: single | double
MaxTime — Time limit
"Inf" (default) | nonnegative numeric scalar

Time limit for each optimization problem, specified as a nonnegative numeric scalar or "Inf". The time limit is in seconds, as measured by tic and toc. The software performs at least one iteration for each optimization problem, regardless of the value of MaxTime. The run time can exceed MaxTime because MaxTime does not interrupt function evaluations.

Data Types: single | double | char | string
MinTrainingSetSize — Minimum number of observations in each training set
[] (default) | positive integer

Minimum number of observations in each training set, specified as a positive integer or []. This property is ignored by all fitting functions except fitcauto and fitrauto. You can set MinTrainingSetSize only when Optimizer is "asha". If Optimizer is "asha" and MinTrainingSetSize is [], then fitcauto and fitrauto use a MinTrainingSetSize value of 100.

Tip
For reliable results, specify MinTrainingSetSize as a value of 100 or greater.

Data Types: single | double
MaxTrainingSetSize — Maximum number of observations in each training set
[] (default) | positive integer

Maximum number of observations in each training set, specified as a positive integer or []. This property is ignored by all fitting functions except fitcauto and fitrauto, and corresponds to the largest training set size used during the optimization. You can set MaxTrainingSetSize only when Optimizer is "asha". If MaxTrainingSetSize is [], the fitcauto and fitrauto functions use a default value that depends on the type of cross-validation. For more information, see the HyperparameterOptimizationOptions name-value argument description on the fitting function documentation page.

Tip
For reliable results, specify MaxTrainingSetSize as a value of 100 or greater.

Data Types: single | double
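For instance, a minimal sketch of configuring the ASHA training-set limits for fitcauto (the sizes 200 and 10000 are illustrative choices, and X and Y stand for predictor data and labels assumed to be in the workspace):

% Configure ASHA optimization with explicit training set size limits
hpoOptions = hyperparameterOptimizationOptions(Optimizer="asha", ...
    MinTrainingSetSize=200,MaxTrainingSetSize=10000);
Mdl = fitcauto(X,Y,HyperparameterOptimizationOptions=hpoOptions);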
NumGridDivisions — Number of grid divisions in grid search
vector of positive integers | positive integer | []

Number of grid divisions in each grid search, specified as a vector of positive integers giving the number of values for each dimension, a positive integer that applies to all dimensions, or []. You can set NumGridDivisions only when Optimizer is "gridsearch".

- If you specify Optimizer="gridsearch" when you create hpoOptions, the default value of NumGridDivisions is 10. Otherwise, the default value is [].
- If you set Optimizer to "gridsearch" using dot notation, the software sets NumGridDivisions to 10.

The fitrauto and fitcauto fitting functions do not support grid search.

Data Types: single | double
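For instance, a minimal sketch of the dot-notation behavior described above (the commented value of 10 follows from the documented default, and Verbose=0 is included only so the object is created with at least one name-value argument):

% Create an options object, then switch to grid search using dot notation
hpoOptions = hyperparameterOptimizationOptions(Verbose=0);
hpoOptions.Optimizer = "gridsearch";
hpoOptions.NumGridDivisions   % expected to be 10 per the documented behavior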
UseParallel — Flag to run in parallel
false or 0 (default) | true or 1

Flag to run in parallel, specified as a numeric or logical 1 (true) or 0 (false). If the value of UseParallel is true and you pass hpoOptions to a fitting function, the software executes for-loop iterations by using parfor. The loop runs in parallel when you have Parallel Computing Toolbox™.

Note
Due to the nonreproducibility of parallel timing, parallel optimization does not necessarily yield reproducible results.

Data Types: logical
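For instance, a minimal sketch of enabling parallel evaluation (running the loop in parallel requires Parallel Computing Toolbox):

% Request parallel objective function evaluations during the optimization
hpoOptions = hyperparameterOptimizationOptions(UseParallel=true);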
Cross-Validation Properties
CVPartition — Cross-validation partition
[] (default) | cvpartition object

Cross-validation partition, specified as a cvpartition object created using cvpartition, or as []. The partition object specifies the type of cross-validation, as well as the indexing for the training and validation sets.

If you specify CVPartition when you create hpoOptions, you cannot specify Holdout or KFold. When you set CVPartition using dot notation, the software sets Holdout=[] and KFold=[].
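For instance, a minimal sketch of supplying a custom partition (Y stands for a vector of class labels assumed to be in the workspace):

% Create a 10-fold partition and use it for the optimization
c = cvpartition(Y,"KFold",10);
hpoOptions = hyperparameterOptimizationOptions(CVPartition=c);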
Holdout — Fraction of data for holdout validation
[] (default) | numeric scalar in the range (0,1)

Fraction of data used for holdout validation, specified as a numeric scalar in the range (0,1), or as []. Holdout validation tests the specified fraction of the data, and uses the rest of the data for training.

If you specify Holdout when you create hpoOptions, you cannot specify CVPartition or KFold. When you set Holdout using dot notation, the software sets CVPartition=[] and KFold=[].

Data Types: single | double
KFold — Number of folds
integer greater than 1 | []

Number of folds to randomly partition the data into during optimization, specified as an integer greater than 1, or as []. For each fold, the software reserves the fold as test data and trains the model using the other k – 1 folds.

When you create hpoOptions:

- If you specify KFold, you cannot specify CVPartition or Holdout.
- If you specify Optimizer, then the default value of KFold is 5.
- If you specify CVPartition or Holdout, then the value of KFold is [].

When you set KFold using dot notation, the software sets CVPartition=[] and Holdout=[].

Data Types: single | double
Repartition — Flag to repartition cross-validation
false or 0 (default) | true or 1

Flag to repartition the cross-validation at every optimization iteration, specified as a numeric or logical 0 (false) or 1 (true). If Repartition is false (the default), the software uses a single partition for each optimization.

Specifying Repartition=true usually gives the most robust results, because this setting takes partitioning noise into account. However, for optimal results, this setting requires at least twice as many function evaluations.

Data Types: logical
Output Properties
SaveIntermediateResults — Flag to save intermediate results
false or 0 (default) | true or 1

Flag to save intermediate results during hyperparameter optimization when Optimizer="bayesopt", specified as a numeric or logical 0 (false) or 1 (true). If SaveIntermediateResults is true and Constraint is a value other than "None", then the software overwrites a workspace variable at each optimization iteration. The name and type of the workspace variable depend on the fitting function to which you pass hpoOptions. For more information, see the HyperparameterOptimizationOptions name-value argument description on the fitting function documentation page.

Data Types: logical
ShowPlots — Flag to show plots
true or 1 (default) | false or 0

Flag to show plots during optimization, specified as a numeric or logical 1 (true) or 0 (false). The types of plots depend on the fitting function to which you pass hpoOptions. For more information, see the HyperparameterOptimizationOptions name-value argument description on the fitting function documentation page.

Data Types: logical
Verbose — Command-line display level
1 (default) | 0 | 2

Command-line display level, specified as 0, 1, or 2. When you pass hpoOptions to a fitting function, the software behaves as follows, depending on the value of Verbose:

- 0 — No iterative display
- 1 — Iterative display
- 2 — Iterative display with additional information

The type of command-line display output depends on the fitting function to which you pass hpoOptions. For more information on the display output for the automated model selection functions, see the Verbose Display section of fitcauto and fitrauto. For all other fitting functions, see the bayesopt Verbose name-value argument.

Data Types: single | double
Examples
Hyperparameter Optimization with Multiple Constraint Bounds
Repeat the hyperparameter optimization of a trained classification tree model several times using the same optimization settings, but with different constraint bounds each time.
Load the ionosphere data set.
load ionosphere.mat
Create a HyperparameterOptimizationOptions object that contains the settings for three hyperparameter optimization problems. For each problem, specify to use the compact size of the trained model as the constraint, and use the default settings for the other optimization options. Specify the constraint bounds as 0 to 5000 bytes for the first problem, 0 to 20000 bytes for the second problem, and 0 to 50000 bytes for the third problem.

rng(0,"twister"); % For reproducibility
hpoOptions = hyperparameterOptimizationOptions(ConstraintType="size", ...
    ConstraintBounds=[5000; 20000; 50000])

hpoOptions = 
  HyperparameterOptimizationOptions with properties:

                  Objective: "kfoldLoss"
                 Constraint: "CompactModelSize"
    AcquisitionFunctionName: "expected-improvement-per-second-plus"
           ConstraintBounds: [3x2 double]
           ConstraintTarget: "matlab"
             ConstraintType: "size"
                      KFold: 5
                    MaxTime: Inf
                  Optimizer: "bayesopt"
                Repartition: 0
    SaveIntermediateResults: 0
                  ShowPlots: 1
                UseParallel: 0
                    Verbose: 1
hpoOptions is a HyperparameterOptimizationOptions object that contains hyperparameter optimization options for the classification tree fitting function.

Modify the default property values in hpoOptions to suppress the display of plots and command-line output.

hpoOptions.ShowPlots=false;
hpoOptions.Verbose=0;
Train a classification tree model using the fitctree function, and optimize the model hyperparameters for each optimization problem subject to the constraints in hpoOptions. Additionally, return the results object hpoResults.

[Mdl,hpoResults]=fitctree(X,Y,OptimizeHyperparameters="auto", ...
    HyperparameterOptimizationOptions=hpoOptions)

Mdl=3×1 cell array
    {1x1 ClassificationTree}
    {1x1 ClassificationTree}
    {1x1 ClassificationTree}

hpoResults = 
  AggregateBayesianOptimization with properties:

                            Objective: "kfoldLoss"
                           Constraint: "CompactModelSize"
          BayesianOptimizationResults: {3x1 cell}
             ConstraintAtMinObjective: [3x1 double]
                     ConstraintBounds: [3x2 double]
         ConstraintBoundsAreSatisfied: [3x1 logical]
                     ConstraintTarget: "matlab"
                       ConstraintType: "size"
                             Feasible: [3x1 logical]
    HyperparameterOptimizationResults: {3x1 cell}
                LearnerAtMinObjective: [3x1 string]
                         MinObjective: [3x1 double]
                 VariableDescriptions: {3x1 cell}
Mdl is a cell array that contains the trained classification model object for each optimization problem. hpoResults is an AggregateBayesianOptimization object that contains the options and results for each optimization problem.
Alternative Functionality
You can also perform hyperparameter optimization with the same set of available options by passing a structure to a fitting function that supports hyperparameter optimization. To do so, specify the HyperparameterOptimizationOptions name-value argument of the function. For more information, see the Name-Value Arguments section of the individual fitting function pages.
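For instance, a minimal sketch of the structure-based workflow (the field values are illustrative choices, and X and Y stand for predictor data and labels assumed to be in the workspace):

% Pass optimization options as a structure instead of an options object
opts = struct("MaxObjectiveEvaluations",30,"ShowPlots",false,"Verbose",0);
Mdl = fitctree(X,Y,OptimizeHyperparameters="auto", ...
    HyperparameterOptimizationOptions=opts);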
Version History
Introduced in R2024b