Agustin on 17 Nov 2014
I have written the following code to do cross-validation with TreeBagger (using the fisheriris data set):
load fisheriris
X = meas;
y = species;
cp = cvpartition(y,'k',10); % 10-fold stratified partition
% predfun: train a 50-tree ensemble on the training fold, predict the test fold
classF = @(XTRAIN,ytrain,XTEST)(predict(TreeBagger(50,XTRAIN,ytrain),XTEST));
misclassError = crossval('mcr',X,y,'predfun',classF,'partition',cp);
I hope it is useful.
Ilya on 7 Apr 2012
If you have Statistics Toolbox and MATLAB R2009a or later, you can use TreeBagger. Please read the documentation and take a look at the examples. Follow up with a specific question if something remains unclear.
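A minimal sketch of the TreeBagger workflow, assuming the fisheriris data set used elsewhere in this thread (the choice of 50 trees is arbitrary):

```matlab
% Train a bagged ensemble of classification trees and predict labels.
load fisheriris                       % meas (features), species (labels)
b = TreeBagger(50, meas, species);    % 50 trees; defaults otherwise
labels = predict(b, meas(1:5,:));     % cell array of predicted class names
```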
For an MLP (multilayer perceptron), take a look at the Neural Network Toolbox.
Ilya on 10 Apr 2012
You can use the out-of-bag error as an unbiased estimate of the generalization error. Train TreeBagger with 'oobpred' set to 'on' and call the oobError method.
If you insist on using cross-validation, run 'doc crossval' and follow the examples there.
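The out-of-bag approach above can be sketched as follows (assuming the fisheriris data set; the number of trees is an arbitrary choice):

```matlab
% Estimate generalization error from out-of-bag observations:
% each tree is tested on the samples left out of its bootstrap sample.
load fisheriris
b = TreeBagger(50, meas, species, 'oobpred', 'on');
err = oobError(b);    % misclassification rate vs. number of grown trees
plot(err)
xlabel('Number of grown trees')
ylabel('Out-of-bag classification error')
```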
Richard Willey on 11 Apr 2012
I did a webinar a couple of years ago titled:
"Computational Statistics: An Introduction to Classification with MATLAB". You can watch the recorded webinar online. The demo code and data sets are available on MATLAB Central.