Entropy
- Entropy is the expected value of surprise
- The lower the probability of an event, the more surprising it is
- The more surprise we expect on average, the higher the entropy
Entropy Formula
- The surprise of an outcome with probability p is log2(1/p)
- We use the base-2 log, as there are 2 classes: H = -(p * log2(p) + (1 - p) * log2(1 - p))
- For N classes, we will use H = -sum over i of p_i * log2(p_i)
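The N-class formula above can be sketched in Python (a minimal illustration; the function name `entropy` is my own, not from the notes):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    # 0 * log2(0) is taken as 0, so zero-probability classes are skipped
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely classes: maximum average surprise -> 1 bit
entropy([0.5, 0.5])  # -> 1.0
```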
How can entropy be used?
Entropy quantifies how mixed (or how pure) a set of class labels is:
- Low entropy means a large imbalance between the classes, e.g., 1 of class A vs. 100 of class B
- High entropy means the classes are nearly balanced, e.g., 49 of class A vs. 51 of class B
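The two examples above can be checked numerically. This is a self-contained sketch; the helper `class_entropy` and the count lists are illustrative assumptions:

```python
import math

def class_entropy(counts):
    """Base-2 entropy of a list of class counts."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)

# 1 class A vs. 100 class B: very unbalanced, entropy close to 0
low = class_entropy([1, 100])
# 49 class A vs. 51 class B: nearly balanced, entropy close to 1 bit
high = class_entropy([49, 51])
print(round(low, 2), round(high, 2))
```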