How to set maximum number of iterations in bayesopt

12 views (last 30 days)
Shreenath Krishnamurthy on 9 Nov 2024
Commented: Umar on 12 Nov 2024
Hi,
I am trying to minimize an objective using the bayesopt function. In my code, I set the maximum number of function evaluations to 1000, but the optimization completes after 147 iterations. How do I make the function run for the full number of iterations? Does bayesopt have its own stopping criterion based on a tolerance?
Screenshot attached for now; happy to share the code if needed.
thanks,
[Attached screenshot: bayesopt.png]

Accepted Answer

Umar on 10 Nov 2024

Hi @Shreenath Krishnamurthy,

The bayesopt function in MATLAB performs Bayesian optimization, often used for hyperparameter tuning in machine learning. While you can cap the number of evaluations with the MaxObjectiveEvaluations option, bayesopt may terminate early based on its internal criteria, which include convergence checks that assess whether additional evaluations are likely to yield significant improvement.

To make the optimization run for the full 1000 iterations, consider the following. You have already specified MaxObjectiveEvaluations, but double-check that it is passed correctly in your code. The optimization may stop early if it detects that further evaluations will not improve the result significantly; this behavior is influenced by options such as AcquisitionFunctionName and ExplorationRatio, and adjusting them changes how aggressively the algorithm explores the search space. Finally, set the Verbose option to a higher level (e.g., 2) to get detailed output at each iteration, which can reveal why the run stopped early.

Here is an example code snippet that incorporates these settings and ensures you utilize all available iterations:

% Load data
load ionosphere;
% Define optimizable variables
num = optimizableVariable('n', [1, 30], 'Type', 'integer');   % number of neighbors
dst = optimizableVariable('dst', {'chebychev', 'euclidean', 'minkowski'}, ...
  'Type', 'categorical');                                     % distance metric
% Create cross-validation partition
c = cvpartition(Y, 'KFold', 5); % stratified 5-fold partition on labels Y
% Define objective function
fun = @(x) kfoldLoss(fitcknn(X, Y, 'CVPartition', c, ...
  'NumNeighbors', x.n, 'Distance', char(x.dst), 'NSMethod', 'exhaustive'));
% Run Bayesian Optimization
results = bayesopt(fun, [num, dst], ...
  'Verbose', 2, ... % Increased verbosity for debugging
  'AcquisitionFunctionName', 'expected-improvement-plus', ... 
  'MaxObjectiveEvaluations', 1000, ... % Set max evaluations
  'IsObjectiveDeterministic', true); % Assuming your function is deterministic
% Display results
disp(results);

The load ionosphere command imports the ionosphere dataset, a common binary classification benchmark. The optimizableVariable calls define the parameters to be optimized: num is the number of neighbors and dst is the distance metric. The cvpartition function creates the partition for K-fold cross-validation, which makes the performance estimate more reliable.

The anonymous function fun fits a KNN model with fitcknn using the candidate parameters and returns its cross-validated error via kfoldLoss. The bayesopt function then runs the optimization, iteratively evaluating this objective to find the optimal parameters. Finally, disp(results) displays the outcome, including the best parameters found.

In short, the code demonstrates how to tune a K-Nearest Neighbors classifier using Bayesian optimization; by systematically adjusting the number of neighbors and the distance metric, the model's performance can be significantly improved.

Please see attached.

By default, bayesopt stops when it determines that further evaluations would not yield a better result than what has already been found. Now, if you want more control over this behavior, consider implementing custom stopping criteria or adjusting your acquisition function settings.
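One way to sketch such a custom stopping criterion is with the 'OutputFcn' option: bayesopt calls the supplied function after each iteration with the current results and a state string, and stops as soon as it returns true. The function name, the 50-evaluation window, and the 1e-6 tolerance below are all illustrative choices, not fixed requirements:

```matlab
% Illustrative custom stopping rule for bayesopt:
% stop only when the best objective has improved by less than a
% (hypothetical) tolerance over the last 50 evaluations.
function stop = myStopFcn(results, state)
    stop = false;
    if strcmp(state, 'iteration') && numel(results.ObjectiveTrace) > 50
        recentBest = min(results.ObjectiveTrace(end-49:end));
        priorBest  = min(results.ObjectiveTrace(1:end-50));
        stop = (priorBest - recentBest) < 1e-6;  % illustrative tolerance
    end
end
```

You would then pass it as results = bayesopt(fun, [num, dst], 'OutputFcn', @myStopFcn, ...), replacing the default stopping behavior with your own rule.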

If you are running evaluations in parallel ('UseParallel', true), ensure that your environment supports parallel computing and is configured correctly. If the optimization still stops early despite setting the parameters correctly, experiment with different acquisition functions or modify parameters such as ExplorationRatio.
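For the parallel case, a minimal sketch might look like this (it assumes the Parallel Computing Toolbox is installed; bayesopt will open a parallel pool if none exists, and fun, num, and dst are as defined in the example above):

```matlab
% Run the same optimization with evaluations distributed across workers.
results = bayesopt(fun, [num, dst], ...
    'UseParallel', true, ...              % requires Parallel Computing Toolbox
    'MaxObjectiveEvaluations', 1000);
```

Note that with parallel evaluation the iteration order is no longer deterministic, since results arrive as workers finish.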

By carefully tuning these settings and understanding how bayesopt operates under the hood, you should be able to achieve your desired number of evaluations while effectively minimizing your objective function.

2 comments
Shreenath Krishnamurthy on 12 Nov 2024
Thank you, this helps. I also noticed that I needed to provide a longer MaxTime.
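For reference, MaxTime is specified in seconds (the default is Inf), so the time budget can be raised alongside the evaluation cap; the one-hour value below is just an example:

```matlab
% Cap the run by both evaluation count and wall-clock time.
results = bayesopt(fun, [num, dst], ...
    'MaxObjectiveEvaluations', 1000, ...
    'MaxTime', 3600);  % allow up to one hour (illustrative budget)
```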
Umar on 12 Nov 2024
Hi @Shreenath Krishnamurthy,
Thank you for your feedback. I appreciate your acknowledgment of the information provided.


More Answers (0)


Release

R2022b
