- AdaBoost looks similar to Random Forest, but instead of full Decision Trees it uses Stumps (trees with a single split and two leaves)
- So it can be called a Forest of Stumps
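
A minimal sketch with scikit-learn, assuming a synthetic dataset from `make_classification`: setting `max_depth=1` on the base tree makes every estimator a stump (note: older scikit-learn versions use the keyword `base_estimator` instead of `estimator`).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# max_depth=1 -> each base learner is a stump (one split, two leaves)
stump = DecisionTreeClassifier(max_depth=1)
model = AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0)
model.fit(X, y)
print(model.score(X, y))
```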
 
 
- In contrast to Random Forest, some stumps get more say (voting weight) in the final prediction than others
 
- In Random Forest, trees are made independently of each other
- But in AdaBoost, each stump is made depending on the errors of the previous one, as in Boosting
 
 
- Sometimes the Weighted Gini Index is used to evaluate the stumps, so that the sample weights are taken into account (as sketched below)
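
A minimal sketch of how a Weighted Gini Index can be computed, replacing raw class counts with sums of sample weights; the function names `weighted_gini` and `weighted_gini_split` are just illustrative.

```python
import numpy as np

def weighted_gini(labels, weights):
    """Gini impurity of one node, using sample weights instead of raw counts."""
    total = weights.sum()
    if total == 0:
        return 0.0
    impurity = 1.0
    for c in np.unique(labels):
        p = weights[labels == c].sum() / total   # weighted class proportion
        impurity -= p ** 2
    return impurity

def weighted_gini_split(labels_left, w_left, labels_right, w_right):
    """Weighted Gini of a split: children averaged by their weight fractions."""
    total = w_left.sum() + w_right.sum()
    return (w_left.sum() / total) * weighted_gini(labels_left, w_left) \
         + (w_right.sum() / total) * weighted_gini(labels_right, w_right)
```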
 
Steps:
- Give the same weight (importance) to all samples, 1/N each
 
- Create the stump with the lowest total (weighted) error
 
- Calculate amount_of_say for this stump: amount_of_say = 1/2 * ln((1 - total_error) / total_error)
- Update the weights of the samples
- so that misclassified points get more weight
- and correctly classified points get less weight, then normalize the weights so they sum to 1
 
 
- Go back to the stump-creation step and repeat with the updated weights
 
- Stop when the predetermined number_of_estimators is reached (a from-scratch sketch of this loop follows these steps)
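
A minimal from-scratch sketch of the loop above, assuming binary labels coded as -1/+1 and scikit-learn stumps as weak learners; names such as `adaboost_fit`, `amount_of_say`, and `number_of_estimators` mirror these notes and are not a standard API.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, number_of_estimators=50, eps=1e-10):
    y = np.asarray(y)                        # expected to be -1/+1
    n = len(y)
    weights = np.full(n, 1.0 / n)            # Step 1: equal weight for every sample
    stumps, says = [], []
    for _ in range(number_of_estimators):
        # Step 2: fit a stump that minimizes the weighted error
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=weights)
        pred = stump.predict(X)
        total_error = weights[pred != y].sum()
        # Step 3: amount_of_say = 1/2 * ln((1 - total_error) / total_error)
        # eps guards against division by zero when a stump is perfect
        amount_of_say = 0.5 * np.log((1 - total_error + eps) / (total_error + eps))
        # Step 4: boost misclassified weights, shrink correct ones, then normalize
        weights *= np.exp(-amount_of_say * y * pred)
        weights /= weights.sum()
        stumps.append(stump)
        says.append(amount_of_say)
    return stumps, np.array(says)
```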
Evaluation:
- During evaluation, for classification:
- Get the predicted class from each estimator

- Sum each estimator's amount_of_say per class and take the class with the highest total (see the sketch below)
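
A minimal sketch of the weighted vote at prediction time, matching the `adaboost_fit` sketch above (labels in -1/+1):

```python
import numpy as np

def adaboost_predict(X, stumps, says):
    # Each stump votes -1 or +1, scaled by its amount_of_say;
    # the class with the larger total amount_of_say wins.
    votes = sum(say * stump.predict(X) for stump, say in zip(stumps, says))
    return np.sign(votes)
```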
 
