Classification Ensembles
A classification ensemble is a predictive model composed of a weighted combination of multiple classification models. In general, combining multiple classification models increases predictive power.
To explore classification ensembles interactively, use the Classification Learner app. For greater flexibility, use fitcensemble at the command line to boost or bag classification trees, or to grow a random forest [12]. For details about all supported ensembles, see Ensemble Algorithms. To reduce a multiclass problem to an ensemble of binary classification problems, train an error-correcting output codes (ECOC) model. For details, see fitcecoc.
To boost regression trees using LSBoost, or to grow a random forest of regression trees [12], see Regression Ensembles.
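As a brief illustration of the command-line workflow described above, the following sketch trains a boosted classification ensemble with fitcensemble on Fisher's iris data (the fisheriris dataset ships with Statistics and Machine Learning Toolbox). The choice of boosting method and number of learning cycles here is illustrative, not prescriptive.

```matlab
% Load Fisher's iris data (included with Statistics and Machine Learning Toolbox).
load fisheriris

% Boost 100 classification trees; AdaBoostM2 supports multiclass responses.
Mdl = fitcensemble(meas, species, ...
    'Method', 'AdaBoostM2', 'NumLearningCycles', 100);

% Estimate the misclassification rate on the training data.
trainLoss = resubLoss(Mdl);

% Predict the class label of a new observation.
label = predict(Mdl, [5.9 3.0 5.1 1.8]);
```

For binary problems, fitcensemble also accepts methods such as 'Bag' or 'LogitBoost'; see Ensemble Algorithms for guidance on choosing one.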
Apps
Classification Learner | Train models to classify data using supervised machine learning
Bloques
ClassificationEnsemble Predict | Classify observations using ensemble of decision trees (since R2021a)
ClassificationECOC Predict | Classify observations using error-correcting output codes (ECOC) classification model (since R2023a)
Functions
Classes
Topics
- Train Ensemble Classifiers Using Classification Learner App
Create and compare ensemble classifiers, and export trained models to make predictions for new data.
- Framework for Ensemble Learning
Obtain highly accurate predictions by using many weak learners.
- Ensemble Algorithms
Learn about different algorithms for ensemble learning.
- Train Classification Ensemble
Train a simple classification ensemble.
- Test Ensemble Quality
Learn methods to evaluate the predictive quality of an ensemble.
- Handle Imbalanced Data or Unequal Misclassification Costs in Classification Ensembles
Learn how to set prior class probabilities and misclassification costs.
- Classification with Imbalanced Data
Use the RUSBoost algorithm for classification when one or more classes are over-represented in your data.
- LPBoost and TotalBoost for Small Ensembles
Create small ensembles by using the LPBoost and TotalBoost algorithms.
- Tune RobustBoost
Tune RobustBoost parameters for better predictive accuracy.
- Surrogate Splits
Gain better predictions when you have missing data by using surrogate splits.
- Train Classification Ensemble in Parallel
Train a bagged ensemble in parallel reproducibly.
- Bootstrap Aggregation (Bagging) of Classification Trees Using TreeBagger
Create a TreeBagger ensemble for classification.
- Credit Rating by Bagging Decision Trees
This example shows how to build an automated credit rating tool.
- Random Subspace Classification
Increase the accuracy of classification by using a random subspace ensemble.
- Predict Class Labels Using ClassificationEnsemble Predict Block
Train a classification ensemble model with optimal hyperparameters, and then use the ClassificationEnsemble Predict block for label prediction.
- Predict Class Labels Using ClassificationECOC Predict Block
Train an ECOC classification model, and then use the ClassificationECOC Predict block for label prediction. (since R2023a)