
Questions tagged [bagging]

1 vote
1 answer
77 views

The book Hands-On Machine Learning has a section on Out-of-Bag Evaluation related to Decision Trees, where it's stated that "By default, a BaggingClassifier samples m training instances with ...
Sahil Gupta
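The out-of-bag evaluation the question refers to can be sketched with scikit-learn's `BaggingClassifier` (dataset and parameter values here are illustrative assumptions, not from the book): each tree is trained on a bootstrap sample, and the instances a tree never saw are used to score it, so no separate validation set is needed.

```python
# Hedged sketch of out-of-bag (OOB) evaluation with bagging.
# The default base estimator of BaggingClassifier is a decision tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier

X, y = make_classification(n_samples=500, random_state=42)

bag = BaggingClassifier(
    n_estimators=100,
    bootstrap=True,    # sample m training instances with replacement
    oob_score=True,    # score each instance on the trees that never saw it
    random_state=42,
)
bag.fit(X, y)
print(bag.oob_score_)  # OOB accuracy estimate
```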
0 votes
2 answers
743 views

Every blog and YouTube video describes the same steps: choose the number N of trees you have to build and do tasks 2-5 ...
Deshwal • 323
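The steps those tutorials list (draw a bootstrap sample, fit one tree per sample, repeat N times, aggregate by majority vote) can be sketched by hand; all names and values below are illustrative.

```python
# Minimal hand-rolled bagging loop: N bootstrap samples, one decision
# tree per sample, majority-vote aggregation across trees.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
rng = np.random.default_rng(0)
N = 25  # number of trees to build

trees = []
for _ in range(N):
    idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
    trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate: majority vote over the N trees (binary labels 0/1)
votes = np.stack([t.predict(X) for t in trees])
pred = (votes.mean(axis=0) >= 0.5).astype(int)
print((pred == y).mean())  # ensemble accuracy on the training set
```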
1 vote
1 answer
54 views

Assume that we have two separate tree regressions. I'm interested in understanding whether the product of tree regressions can be represented by a single tree. Would this be possible?
TFT • 135
1 vote
1 answer
136 views

I know that when a model is made to predict a float value, a common approach to report the model's validation is the k-fold technique, calculating the average of all folds' accuracy (here is a similar ...
morteza • 11
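The averaging over folds described above can be sketched with scikit-learn's `cross_val_score`; the dataset and model here are illustrative assumptions, and for a float target the "accuracy" becomes a regression score (R² by default).

```python
# Hedged sketch of k-fold validation for a regressor: score the model
# on each of k folds and report the mean score.
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, noise=5.0, random_state=1)
scores = cross_val_score(DecisionTreeRegressor(random_state=1), X, y, cv=5)
print(scores.mean())  # average score over the 5 folds
```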
0 votes
1 answer
46 views

Let's say I have a clear case of overfitting, where my loss curves look like this (the x-axis shows iterations). Now I would like to try bagging to reduce the variance; where should I stop each model's training? ...
dzi • 111
2 votes
1 answer
436 views

I'm not completely sure about the bias/variance of boosted decision trees (LightGBM especially), thus I wonder if we generally would expect a performance boost by creating an ensemble of multiple ...
CutePoison
2 votes
1 answer
1k views

Usually if we have $n$ observations, for each tree we form a bootstrapped subsample of size $n$ with replacement. On googling it, one common explanation I've seen is that with-replacement sampling is ...
user9343456
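A standard fact behind this question can be checked numerically: a bootstrap sample of size n drawn with replacement contains roughly 1 − 1/e ≈ 63.2% of the distinct original observations, and the remaining ~36.8% are "out of bag". A quick simulation sketch:

```python
# Simulate one bootstrap sample and measure the fraction of distinct
# original observations it contains (expected ~0.632 for large n).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sample = rng.integers(0, n, size=n)        # n draws with replacement
unique_frac = len(np.unique(sample)) / n
print(unique_frac)                          # close to 1 - 1/e
```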
5 votes
1 answer
755 views

Stacking can be achieved with heterogeneous algorithms such as RF, SVM and KNN. However, can such heterogeneity be achieved in Bagging or Boosting? For example, in Boosting, instead of using RF in ...
Ahmad Bilal
0 votes
1 answer
2k views

I found the definition: ...
good_evening
1 vote
1 answer
209 views

I've been doing some research on ensemble learning and read that for base models, models with high variance are often recommended (I can't remember which book I read this in exactly). But it seems ...
haneulkim • 487
2 votes
1 answer
252 views

I am a bit confused about two concepts. From my understanding, bagging is when each instance is replaced after each draw. So, for example, for each subset of data you pick one from the population, replace it ...
haneulkim • 487
0 votes
1 answer
63 views

If bagging reduces overfitting, then is the general statement that base learners of ensemble models should have high bias and low variance (that is, should be underfitting) wrong?
Aman Oswal
1 vote
1 answer
509 views

The accuracy of my bagging decision tree model reaches 97% when I set the random seed to 5, but drops to only 92% when I set the random seed to 0. Can someone explain the huge gap and ...
Farrah 1234
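Seed-to-seed accuracy gaps like the one described usually come from randomness in the train/test split and in the bootstrap samples; a common remedy is to report the mean and spread over several seeds rather than a single run. A sketch under assumed data and parameters:

```python
# Measure how much bagging accuracy moves with the random seed, then
# report mean and standard deviation over several seeds.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, random_state=7)
accs = []
for seed in range(10):
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                          random_state=seed)
    model = BaggingClassifier(n_estimators=50, random_state=seed)
    accs.append(model.fit(Xtr, ytr).score(Xte, yte))

print(f"{np.mean(accs):.3f} +/- {np.std(accs):.3f}")
```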
2 votes
1 answer
1k views

I recently ran the gradient boosted tree regressor in scikit-learn via GradientBoostingRegressor(). This model depends on the following hyperparameters: Estimators ($N_1$), Min Samples Leaf ($N_2$)...
AB_IM • 123
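The two hyperparameters named in the excerpt map onto scikit-learn's `n_estimators` and `min_samples_leaf`; the sketch below makes that mapping explicit (dataset and values are illustrative, not recommendations).

```python
# GradientBoostingRegressor with the question's hyperparameters spelled out.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=200, noise=2.0, random_state=0)
model = GradientBoostingRegressor(
    n_estimators=100,    # Estimators (N1)
    min_samples_leaf=3,  # Min Samples Leaf (N2)
    random_state=0,
).fit(X, y)
print(model.score(X, y))  # R^2 on the training data
```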
2 votes
1 answer
62 views

Bagging uses a decision tree as the base classifier. I want to use bagging with a decision tree (C4.5) as the base, as a method to improve the decision tree (C4.5) in my research and solve its overfitting problem. Is ...
Farrah 1234
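scikit-learn has no C4.5 implementation, but a `DecisionTreeClassifier` with `criterion="entropy"` (information gain, the criterion C4.5 builds on) is a common stand-in as the base learner inside `BaggingClassifier`; the comparison below is a sketch on assumed synthetic data.

```python
# Compare a lone entropy-based tree against a bagged ensemble of them.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=3)
single = DecisionTreeClassifier(criterion="entropy", random_state=3)
bagged = BaggingClassifier(single, n_estimators=50, random_state=3)

acc_tree = cross_val_score(single, X, y, cv=5).mean()
acc_bag = cross_val_score(bagged, X, y, cv=5).mean()
print(acc_tree, acc_bag)  # cross-validated accuracy of each
```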
