Forward Feature Selection

Forward feature selection starts with an empty feature set and, at each step, adds the single feature whose inclusion improves prediction the most (i.e., the feature whose addition boosts accuracy the most).

Steps:

  1. Initialize with an empty subset
  2. Pair each candidate feature with the current subset and evaluate it with a pre-defined evaluation metric; the feature yielding the highest performance is added
  3. Repeat step 2 until the pre-defined number of features has been reached (see the sketch below)
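A minimal sketch of these steps, assuming a scikit-learn estimator and cross-validated accuracy as the pre-defined metric; the dataset, model, and the stopping criterion of three features are illustrative assumptions, not part of the original note.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
model = LogisticRegression(max_iter=1000)

selected = []                        # step 1: start with an empty subset
remaining = list(range(X.shape[1]))
n_features_to_select = 3             # pre-defined number of features (assumed)

while len(selected) < n_features_to_select:
    # step 2: evaluate each remaining feature together with the current subset
    scores = {
        f: cross_val_score(model, X[:, selected + [f]], y, cv=5).mean()
        for f in remaining
    }
    best = max(scores, key=scores.get)  # feature with the highest performance
    selected.append(best)
    remaining.remove(best)              # step 3: repeat until the target size is reached

print("Selected feature indices:", selected)
```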

Cons:

  1. It doesn't consider interactions among multiple features (feature A alone might not boost performance enough, while the combination (A, B, D) could boost it significantly)
    1. One mitigation: Backward Feature Elimination (sketched below)
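For contrast, a brief sketch of backward feature elimination using scikit-learn's SequentialFeatureSelector; the estimator, data, and feature count below are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
model = LogisticRegression(max_iter=1000)

# direction="backward" starts with all features and drops the least useful one
# at each step, so features that only help in combination stay in the pool longer.
sfs = SequentialFeatureSelector(model, n_features_to_select=3,
                                direction="backward", cv=5)
sfs.fit(X, y)
print("Kept feature indices:", sfs.get_support(indices=True))
```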


Related Notes