fitensemble AdaBoostM1 stump picking

I am running fitensemble with 'AdaBoostM1'. When I view the first tree, both the left and right decisions are class 1, even though my data has two classes, so the split seems to have no effect. Why would the algorithm pick such a stump?

    Decision tree for classification
    1  if x3<70.5 then node 2 elseif x3>=70.5 then node 3 else 1
    2  class = 1
    3  class = 1

Answers (1)

Ilya on 13 Sep 2012

0 votes

A decision tree in a boosting ensemble by default minimizes the Gini diversity index, not classification error. Two child nodes originating from the same parent can be dominated by the same class. This situation is not uncommon.
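Ilya's point can be checked numerically. The sketch below (plain Python, purely illustrative; the class counts are made-up numbers, not from the question's data) computes the weighted Gini diversity index before and after a hypothetical split in which both children are dominated by class 1. The split still lowers impurity, so a Gini-minimizing tree accepts it even though the predicted class is identical on both sides.

```python
def gini(counts):
    # Gini diversity index of a node: 1 - sum_k p_k^2,
    # where p_k is the fraction of samples in class k.
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

# Hypothetical parent node: 80 samples of class 1, 20 of class 2.
parent = [80, 20]

# A split where BOTH children are majority class 1
# (so both leaves predict class 1, as in the question).
left  = [60, 5]    # 65 samples, predicts class 1
right = [20, 15]   # 35 samples, predicts class 1

n = sum(parent)
weighted_child_gini = (sum(left) / n) * gini(left) \
                    + (sum(right) / n) * gini(right)

print(round(gini(parent), 4))         # → 0.32
print(round(weighted_child_gini, 4))  # → 0.2637, lower than the parent
```

Classification error is unchanged by this split (20 misclassified samples either way), but the Gini criterion improves, which is why such stumps get chosen.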

1 comment

Reda on 21 May 2014
Hello, I noticed the same problem as sedar sedar. When AdaBoosting a classification tree, the learners are all stumps. This is counter-intuitive, especially since fitting a classification tree on its own with the same parameters gives a much deeper tree. So why is the first learner not as deep?
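Shallow learners are in fact the classic choice for AdaBoost: each stump only needs to be slightly better than chance, and the reweighting combines them into a strong classifier. A toy pure-Python sketch of the AdaBoost.M1 idea (illustrative only, not the fitensemble implementation; the 1-D dataset and helper names are invented) shows three stumps, none of which can classify an interval-shaped class alone, combining into a perfect classifier:

```python
import math

# Toy 1-D dataset: label is +1 in the middle band, -1 outside.
# No single threshold stump can separate this.
X = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [-1, -1, -1, 1, 1, 1, 1, -1, -1, -1]

def stump_predict(x, thresh, direction):
    # Predict `direction` below the threshold, the opposite above it.
    return direction if x < thresh else -direction

def best_stump(X, y, w):
    # Exhaustively pick the stump with the lowest weighted error.
    best = None
    for thresh in [x + 0.5 for x in X]:
        for direction in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(xi, thresh, direction) != yi)
            if best is None or err < best[0]:
                best = (err, thresh, direction)
    return best

def adaboost(X, y, rounds):
    n = len(X)
    w = [1.0 / n] * n                     # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        err, thresh, direction = best_stump(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:                    # no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thresh, direction))
        # Reweight: mistakes get heavier, correct points lighter.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thresh, direction))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all stumps.
    score = sum(a * stump_predict(x, t, d) for a, t, d in ensemble)
    return 1 if score >= 0 else -1

model = adaboost(X, y, rounds=3)
print(all(predict(model, xi) == yi for xi, yi in zip(X, y)))  # → True
```

Each individual stump here misclassifies at least three points, yet the weighted vote of three of them fits the training set exactly, which is why boosting does not need deep base learners the way a standalone tree does.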




