Feature Selection
Sometimes more is not better: in machine learning, feeding a model many unnecessary features makes it more likely to overfit to noise in those features. Training also takes longer to converge, because the model must effectively learn which features matter and which do not.
Feature Selection algorithms:
- Correlation between a feature and the target (or between pairs of features)
- Random Forest feature importance
- Decision Tree feature importance
- L1 (Lasso) Regression
- Elastic Net Regression
- Forward Feature Selection
- Backward Feature Elimination
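A minimal sketch of two of the approaches above using scikit-learn. The synthetic dataset, the keep-top-4 cutoff, and the coefficient threshold are illustrative assumptions, not part of the original notes.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV

# Synthetic regression data: 10 features, only the first 4 are informative
# (shuffle=False keeps the informative columns at indices 0-3).
X, y = make_regression(n_samples=300, n_features=10, n_informative=4,
                       noise=1.0, shuffle=False, random_state=0)

# 1) Random forest importance: rank features by impurity-based importance
#    and keep the top k (k=4 is an assumption for this example).
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
rf_top4 = sorted(np.argsort(rf.feature_importances_)[-4:])
print("Random forest keeps:", rf_top4)

# 2) L1 (Lasso) regularization: the penalty drives the coefficients of
#    irrelevant features toward exactly zero, so nonzero coefficients
#    mark the selected features.
lasso = LassoCV(cv=5, random_state=0).fit(X, y)
lasso_kept = np.where(np.abs(lasso.coef_) > 1e-3)[0]
print("Lasso keeps:", list(lasso_kept))
```

Filter methods (correlation) score features independently of any model, while embedded methods like these two get the selection for free from model training; wrapper methods (forward selection, backward elimination) retrain the model on candidate subsets and are the most expensive.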
Pros:
- Improved model performance
- Reduced overfitting
- Increased interpretability
Cons:
- The selection step itself adds computation (wrapper methods in particular retrain the model many times)
TODO:
- Learn algorithms for feature selection