A simple and complete tutorial on AdaBoost
Table of contents
- Simple explanation of AdaBoost
- Step-by-step understanding of how AdaBoost works
- Number of weak learners required
- Bias and variance tradeoff in AdaBoost
- Parameter optimization in AdaBoost
- Feature selection in AdaBoost

SIMPLE EXPLANATION OF ADABOOST

AdaBoost combines an ensemble of weak learners into a single strong learner. Weak learners are models that achieve accuracy just above random chance on a classification problem. The most suitable, and therefore most common, weak learner used with AdaBoost is the decision tree with one level. Because these trees are so short and contain only a single decision for classification, they are often called decision stumps.

AdaBoost is a sequential learner. You run all of your training data through a weak learner and try to classify it. Then, in the next iteration, you give more weight to the examples the previous learner misclassified. So, your next weak learner does a[…]
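To make the reweighting loop above concrete, here is a minimal sketch of discrete AdaBoost with decision stumps. The synthetic dataset, the number of rounds, and the choice of scikit-learn's DecisionTreeClassifier(max_depth=1) as the stump are illustrative assumptions, not details from this article.

```python
# A minimal sketch of the AdaBoost loop described above.
# Assumed for illustration: a synthetic dataset, 25 boosting rounds,
# and scikit-learn decision stumps as the weak learners.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y_signed = np.where(y == 1, 1, -1)           # AdaBoost uses {-1, +1} labels

n_rounds = 25
n = len(X)
w = np.full(n, 1.0 / n)                      # start with uniform example weights
stumps, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y_signed, sample_weight=w)  # weak learner sees the weighted data
    pred = stump.predict(X)

    err = np.sum(w * (pred != y_signed))     # weighted error of this stump
    err = np.clip(err, 1e-10, 1 - 1e-10)     # guard against division by zero
    alpha = 0.5 * np.log((1 - err) / err)    # this stump's say in the final vote

    # Increase the weight of misclassified examples and decrease the weight
    # of correct ones, so the next stump focuses on the hard cases.
    w *= np.exp(-alpha * y_signed * pred)
    w /= w.sum()

    stumps.append(stump)
    alphas.append(alpha)

# The strong learner is the sign of the weighted vote over all stumps.
votes = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
accuracy = np.mean(np.sign(votes) == y_signed)
print(f"training accuracy after {n_rounds} rounds: {accuracy:.3f}")
```

In practice you would rarely write this loop by hand; scikit-learn's AdaBoostClassifier packages the same idea. The manual version is shown only because it makes the weight update, the heart of the sequential learning described above, explicit.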